Oct 01 14:55:54 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 01 14:55:54 crc restorecon[4671]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 01 14:55:54 crc restorecon[4671]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc 
restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc 
restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 
14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 
14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc 
restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc 
restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc 
restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Oct 01 14:55:54 crc restorecon[4671]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:54 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 
crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc 
restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc 
restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc 
restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc 
restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 01 14:55:55 crc restorecon[4671]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 01 14:55:55 crc kubenswrapper[4771]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 14:55:55 crc kubenswrapper[4771]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 01 14:55:55 crc kubenswrapper[4771]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 14:55:55 crc kubenswrapper[4771]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 01 14:55:55 crc kubenswrapper[4771]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 01 14:55:55 crc kubenswrapper[4771]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.746482 4771 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749504 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749518 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749524 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749528 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749532 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749536 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749540 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749544 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749548 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 14:55:55 crc kubenswrapper[4771]: 
W1001 14:55:55.749552 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749555 4771 feature_gate.go:330] unrecognized feature gate: Example Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749559 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749562 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749566 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749570 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749573 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749577 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749580 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749583 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749589 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749594 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749598 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749602 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749606 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749619 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749623 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749627 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749631 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749634 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749639 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749643 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749647 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749650 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749654 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749657 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749662 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749665 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749669 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749672 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749676 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749680 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749684 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749687 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749691 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749694 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749698 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749702 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749705 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749709 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749713 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749716 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749721 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749726 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749743 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749747 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749751 4771 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749755 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749759 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749763 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749767 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749770 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749774 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749779 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749783 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749787 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749791 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749794 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749798 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749804 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749808 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.749812 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751913 4771 flags.go:64] FLAG: --address="0.0.0.0"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751931 4771 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751939 4771 flags.go:64] FLAG: --anonymous-auth="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751945 4771 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751952 4771 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751956 4771 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751963 4771 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751968 4771 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751972 4771 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751977 4771 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751981 4771 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751986 4771 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751990 4771 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751994 4771 flags.go:64] FLAG: --cgroup-root=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.751999 4771 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752003 4771 flags.go:64] FLAG: --client-ca-file=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752007 4771 flags.go:64] FLAG: --cloud-config=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752011 4771 flags.go:64] FLAG: --cloud-provider=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752015 4771 flags.go:64] FLAG: --cluster-dns="[]"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752022 4771 flags.go:64] FLAG: --cluster-domain=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752026 4771 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752030 4771 flags.go:64] FLAG: --config-dir=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752035 4771 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752039 4771 flags.go:64] FLAG: --container-log-max-files="5"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752045 4771 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752049 4771 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752054 4771 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752058 4771 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752062 4771 flags.go:64] FLAG: --contention-profiling="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752067 4771 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752071 4771 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752077 4771 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752081 4771 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752087 4771 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752091 4771 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752095 4771 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752099 4771 flags.go:64] FLAG: --enable-load-reader="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752104 4771 flags.go:64] FLAG: --enable-server="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752108 4771 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752114 4771 flags.go:64] FLAG: --event-burst="100"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752118 4771 flags.go:64] FLAG: --event-qps="50"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752123 4771 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752128 4771 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752132 4771 flags.go:64] FLAG: --eviction-hard=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752137 4771 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752142 4771 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752146 4771 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752150 4771 flags.go:64] FLAG: --eviction-soft=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752154 4771 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752158 4771 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752162 4771 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752166 4771 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752170 4771 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752175 4771 flags.go:64] FLAG: --fail-swap-on="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752179 4771 flags.go:64] FLAG: --feature-gates=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752185 4771 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752189 4771 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752194 4771 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752198 4771 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752203 4771 flags.go:64] FLAG: --healthz-port="10248"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752207 4771 flags.go:64] FLAG: --help="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752211 4771 flags.go:64] FLAG: --hostname-override=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752215 4771 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752220 4771 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752224 4771 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752228 4771 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752231 4771 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752235 4771 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752239 4771 flags.go:64] FLAG: --image-service-endpoint=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752243 4771 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752247 4771 flags.go:64] FLAG: --kube-api-burst="100"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752251 4771 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752255 4771 flags.go:64] FLAG: --kube-api-qps="50"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752260 4771 flags.go:64] FLAG: --kube-reserved=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752264 4771 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752268 4771 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752272 4771 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752276 4771 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752280 4771 flags.go:64] FLAG: --lock-file=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752284 4771 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752288 4771 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752292 4771 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752300 4771 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752305 4771 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752309 4771 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752314 4771 flags.go:64] FLAG: --logging-format="text"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752318 4771 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752323 4771 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752329 4771 flags.go:64] FLAG: --manifest-url=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752334 4771 flags.go:64] FLAG: --manifest-url-header=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752340 4771 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752344 4771 flags.go:64] FLAG: --max-open-files="1000000"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752350 4771 flags.go:64] FLAG: --max-pods="110"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752355 4771 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752360 4771 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752364 4771 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752368 4771 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752372 4771 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752377 4771 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752381 4771 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752392 4771 flags.go:64] FLAG: --node-status-max-images="50"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752397 4771 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752401 4771 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752405 4771 flags.go:64] FLAG: --pod-cidr=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752418 4771 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752443 4771 flags.go:64] FLAG: --pod-manifest-path=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752447 4771 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752453 4771 flags.go:64] FLAG: --pods-per-core="0"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752457 4771 flags.go:64] FLAG: --port="10250"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752462 4771 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752467 4771 flags.go:64] FLAG: --provider-id=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752470 4771 flags.go:64] FLAG: --qos-reserved=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752475 4771 flags.go:64] FLAG: --read-only-port="10255"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752478 4771 flags.go:64] FLAG: --register-node="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752482 4771 flags.go:64] FLAG: --register-schedulable="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752487 4771 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752494 4771 flags.go:64] FLAG: --registry-burst="10"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752498 4771 flags.go:64] FLAG: --registry-qps="5"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752502 4771 flags.go:64] FLAG: --reserved-cpus=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752506 4771 flags.go:64] FLAG: --reserved-memory=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752511 4771 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752516 4771 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752520 4771 flags.go:64] FLAG: --rotate-certificates="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752524 4771 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752528 4771 flags.go:64] FLAG: --runonce="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752533 4771 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752537 4771 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752542 4771 flags.go:64] FLAG: --seccomp-default="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752546 4771 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752550 4771 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752554 4771 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752558 4771 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752563 4771 flags.go:64] FLAG: --storage-driver-password="root"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752567 4771 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752571 4771 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752576 4771 flags.go:64] FLAG: --storage-driver-user="root"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752581 4771 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752585 4771 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752590 4771 flags.go:64] FLAG: --system-cgroups=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752594 4771 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752600 4771 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752604 4771 flags.go:64] FLAG: --tls-cert-file=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752608 4771 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752615 4771 flags.go:64] FLAG: --tls-min-version=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752619 4771 flags.go:64] FLAG: --tls-private-key-file=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752624 4771 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752628 4771 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752633 4771 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752637 4771 flags.go:64] FLAG: --v="2"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752643 4771 flags.go:64] FLAG: --version="false"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752649 4771 flags.go:64] FLAG: --vmodule=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752655 4771 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.752659 4771 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752818 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752824 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752828 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752832 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752837 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752841 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752845 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752849 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752852 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752856 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752860 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752863 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752866 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752870 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752874 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752879 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752884 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752888 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752892 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752897 4771 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752901 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752905 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752908 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752913 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752916 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752920 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752924 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752928 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752932 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752936 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752940 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752944 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752949 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752953 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752956 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752960 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752964 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752967 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752972 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752976 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752980 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752984 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752988 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752992 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752995 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.752999 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753003 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753007 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753010 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753014 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753017 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753021 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753025 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753028 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753032 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753035 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753039 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753042 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753045 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753049 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753052 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753056 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753059 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753063 4771 feature_gate.go:330] unrecognized feature gate: Example
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753068 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753073 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753077 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753081 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753085 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753090 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.753095 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.753916 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.764394 4771 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.764416 4771 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764484 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764490 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764494 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764499 
4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764502 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764506 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764510 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764513 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764517 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764520 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764524 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764527 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764531 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764534 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764538 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764541 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764545 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764548 4771 
feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764552 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764555 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764559 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764562 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764567 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764572 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764577 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764582 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764586 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764590 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764594 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764598 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764602 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764605 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764608 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764612 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764616 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764620 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764623 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764627 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764631 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764635 4771 
feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764639 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764642 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764646 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764649 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764653 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764657 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764661 4771 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764664 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764668 4771 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764671 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764675 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764678 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764682 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764685 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 
01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764689 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764692 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764697 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764701 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764704 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764708 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764712 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764749 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764753 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764757 4771 feature_gate.go:330] unrecognized feature gate: Example Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764761 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764764 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764768 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764771 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764776 4771 feature_gate.go:353] Setting 
GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764782 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764793 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.764799 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764924 4771 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764932 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764936 4771 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764940 4771 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764945 4771 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764949 4771 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764952 4771 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764956 4771 feature_gate.go:330] unrecognized feature gate: 
ClusterMonitoringConfig Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764959 4771 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764963 4771 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764966 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764970 4771 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764973 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764977 4771 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764980 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764984 4771 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764987 4771 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764992 4771 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.764997 4771 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765001 4771 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765005 4771 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765008 4771 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765012 4771 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765016 4771 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765019 4771 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765023 4771 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765027 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765030 4771 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765034 4771 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765037 4771 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765041 4771 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765044 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 
14:55:55.765049 4771 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765054 4771 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765058 4771 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765062 4771 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765066 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765069 4771 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765074 4771 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765078 4771 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765082 4771 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765086 4771 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765089 4771 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765093 4771 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765096 4771 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765100 4771 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765104 4771 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765107 4771 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765112 4771 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765116 4771 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765120 4771 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765124 4771 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765128 4771 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765132 4771 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765136 4771 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765140 4771 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765144 4771 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765147 4771 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765151 4771 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765155 4771 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765159 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765163 4771 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765168 4771 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765172 4771 
feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765176 4771 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765179 4771 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765183 4771 feature_gate.go:330] unrecognized feature gate: Example Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765186 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765190 4771 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765194 4771 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.765197 4771 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.765203 4771 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.765340 4771 server.go:940] "Client rotation is on, will bootstrap in background" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.770918 4771 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.771004 4771 certificate_store.go:130] Loading cert/key pair from 
"/var/lib/kubelet/pki/kubelet-client-current.pem". Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.773009 4771 server.go:997] "Starting client certificate rotation" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.773039 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.773266 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-25 17:43:43.351422493 +0000 UTC Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.773416 4771 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1322h47m47.578010201s for next certificate rotation Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.805355 4771 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.807355 4771 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.827928 4771 log.go:25] "Validated CRI v1 runtime API" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.860877 4771 log.go:25] "Validated CRI v1 image API" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.862633 4771 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.868597 4771 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-01-14-51-29-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.868635 4771 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} 
/dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:45 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.882805 4771 manager.go:217] Machine: {Timestamp:2025-10-01 14:55:55.880177987 +0000 UTC m=+0.499353168 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654120448 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:ab8b87ec-94d1-4eae-9ea3-b28f83991d01 BootID:f03ada0f-e2c8-42c8-86e3-3e9572f1e63b Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:45 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:27:2e:e7 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 
Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:27:2e:e7 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:2c:a9:32 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:f8:82:f4 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:f7:98:2e Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f6:f6:ac Speed:-1 Mtu:1496} {Name:eth10 MacAddress:aa:09:2f:a8:4a:d0 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:fa:86:0c:68:0d:63 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654120448 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.883034 4771 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.883213 4771 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.885298 4771 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.885492 4771 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.885532 4771 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.885882 4771 topology_manager.go:138] "Creating topology manager with none policy"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.885902 4771 container_manager_linux.go:303] "Creating device plugin manager"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.886389 4771 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.886419 4771 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.886605 4771 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.886697 4771 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.890263 4771 kubelet.go:418] "Attempting to sync node with API server"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.890286 4771 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.890334 4771 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.890347 4771 kubelet.go:324] "Adding apiserver pod source"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.890371 4771 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.894469 4771 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.895539 4771 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.897322 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.897374 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Oct 01 14:55:55 crc kubenswrapper[4771]: E1001 14:55:55.897426 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Oct 01 14:55:55 crc kubenswrapper[4771]: E1001 14:55:55.897440 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.898125 4771 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903119 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903184 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903202 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903218 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903244 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903269 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903284 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903310 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903326 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903344 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903367 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.903382 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.904432 4771 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.905456 4771 server.go:1280] "Started kubelet"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.907376 4771 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.907538 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.907570 4771 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.907692 4771 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.907942 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 20:01:06.361353813 +0000 UTC
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.907987 4771 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 2261h5m10.453368655s for next certificate rotation
Oct 01 14:55:55 crc kubenswrapper[4771]: E1001 14:55:55.908192 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 14:55:55 crc systemd[1]: Started Kubernetes Kubelet.
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.908435 4771 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.908443 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.908857 4771 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.908892 4771 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.908953 4771 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.909074 4771 factory.go:55] Registering systemd factory
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.909096 4771 factory.go:221] Registration of the systemd container factory successfully
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.909479 4771 factory.go:153] Registering CRI-O factory
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.909519 4771 factory.go:221] Registration of the crio container factory successfully
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.909624 4771 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.909709 4771 factory.go:103] Registering Raw factory
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.909776 4771 manager.go:1196] Started watching for new ooms in manager
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.909813 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Oct 01 14:55:55 crc kubenswrapper[4771]: E1001 14:55:55.909904 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Oct 01 14:55:55 crc kubenswrapper[4771]: E1001 14:55:55.911795 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="200ms"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.911852 4771 manager.go:319] Starting recovery of all containers
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.912418 4771 server.go:460] "Adding debug handlers to kubelet server"
Oct 01 14:55:55 crc kubenswrapper[4771]: E1001 14:55:55.916936 4771 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186a65d1bb3b5920 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-01 14:55:55.905399072 +0000 UTC m=+0.524574283,LastTimestamp:2025-10-01 14:55:55.905399072 +0000 UTC m=+0.524574283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.932896 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.933083 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.933191 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.933315 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.933399 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.933529 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935300 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935340 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935376 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935395 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935421 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935441 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935462 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935553 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935604 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935624 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935639 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935715 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935863 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935885 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.935968 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936002 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936029 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936046 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936103 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936193 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936257 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936283 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936305 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936345 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.936366 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940435 4771 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940494 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940511 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940527 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940542 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940558 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940573 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940592 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940607 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940621 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940634 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940648 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940661 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940674 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940686 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940700 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940712 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940742 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940792 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940805 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940819 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940831 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940861 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940875 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940888 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940904 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940922 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940943 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940953 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940965 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940977 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.940988 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941001 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941014 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941026 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941038 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941050 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941063 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941075 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941086 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941097 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941109 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941120 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941131 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941143 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941156 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941168 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941181 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a"
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941191 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941202 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941216 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941227 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941239 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941252 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" 
seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941264 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941293 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941304 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941317 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941331 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941345 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 
01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941357 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941369 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941383 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941397 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941415 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941427 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941439 4771 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941452 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941464 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941475 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941488 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941500 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941511 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941523 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941592 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941622 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941634 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941646 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941660 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941671 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941685 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941697 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941711 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941724 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941757 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" 
seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941769 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941782 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941795 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941806 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941818 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941832 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 
14:55:55.941845 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941857 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941869 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941883 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941924 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941937 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941951 4771 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941963 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941975 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941985 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.941995 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942006 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942017 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942028 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942038 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942071 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942082 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942091 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942102 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942112 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942123 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942133 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942145 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942174 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942184 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" 
seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942195 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942206 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942218 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942228 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942241 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942251 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942277 4771 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942289 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942300 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942311 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942321 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942332 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942343 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942354 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942381 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942392 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942411 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942422 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942434 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942445 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942457 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942581 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942607 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942617 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942630 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942647 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942661 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942671 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942684 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942739 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942770 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942782 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942794 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942805 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942818 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942830 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942841 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942866 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942907 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942920 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942933 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942945 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942959 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942971 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942982 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.942995 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943020 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943031 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943043 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943055 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943068 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943080 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943091 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943117 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943142 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943154 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943166 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943225 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943237 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943249 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943260 4771 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943271 4771 reconstruct.go:97] "Volume reconstruction finished"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.943279 4771 reconciler.go:26] "Reconciler: start to sync state"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.951350 4771 manager.go:324] Recovery completed
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.962189 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.963694 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.963765 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.963779 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.965000 4771 cpu_manager.go:225] "Starting CPU manager" policy="none"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.965013 4771 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.965033 4771 state_mem.go:36] "Initialized new in-memory state store"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.981288 4771 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.983878 4771 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.983950 4771 status_manager.go:217] "Starting to sync pod status with apiserver"
Oct 01 14:55:55 crc kubenswrapper[4771]: I1001 14:55:55.984005 4771 kubelet.go:2335] "Starting kubelet main sync loop"
Oct 01 14:55:55 crc kubenswrapper[4771]: E1001 14:55:55.984083 4771 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Oct 01 14:55:55 crc kubenswrapper[4771]: W1001 14:55:55.989275 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Oct 01 14:55:55 crc kubenswrapper[4771]: E1001 14:55:55.989394 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.003556 4771 policy_none.go:49] "None policy: Start"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.004908 4771 memory_manager.go:170] "Starting memorymanager" policy="None"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.004952 4771 state_mem.go:35] "Initializing new in-memory state store"
Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.009224 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.084298 4771 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.109993 4771 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.110296 4771 manager.go:334] "Starting Device Plugin manager"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.110403 4771 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.110428 4771 server.go:79] "Starting device plugin registration server"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.110935 4771 eviction_manager.go:189] "Eviction manager: starting control loop"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.110959 4771 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.111507 4771 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.111627 4771 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.111635 4771 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.112528 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="400ms"
Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.119418 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.211470 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.213099 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.213145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.213159 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.213191 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.213845 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.284819 4771 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.284997 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.286545 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.286613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.286627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.286945 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.287183 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.287232 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288165 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288650 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288775 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.288824 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.289659 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.289692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.289708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.289779 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.289802 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.289813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.289872 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.290046 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.290134 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.290483 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.290515 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.290526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.290617 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.290752 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.290792 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.293183 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.293222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.293247 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.293507 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.293547 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.293656 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.293716 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.293787 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.295146 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.295405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.296383 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.296471 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.296414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.296503 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352207 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352287 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352327 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352361 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352536 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352605 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352622 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352639 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352669 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352748 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352790 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352814 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.352833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.414089 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.416043 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.416081 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.416145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.416226 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.417015 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454482 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454554 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454591 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454628 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454666 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454697 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454721 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454765 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Oct 01 14:55:56 crc kubenswrapper[4771]: I1001
14:55:56.454898 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454935 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454942 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454969 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454975 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455015 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455021 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455048 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455071 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455107 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.454828 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455197 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455219 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") 
pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455295 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455372 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.455429 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.513912 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="800ms" Oct 01 
14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.618442 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.645221 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: W1001 14:55:56.663960 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f9002cc6552bc3c9f79c6cb2f0f87378f570d37f919d8af4298cd9dcd9313705 WatchSource:0}: Error finding container f9002cc6552bc3c9f79c6cb2f0f87378f570d37f919d8af4298cd9dcd9313705: Status 404 returned error can't find the container with id f9002cc6552bc3c9f79c6cb2f0f87378f570d37f919d8af4298cd9dcd9313705 Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.666692 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: W1001 14:55:56.679141 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-15d98d0e79ea384f031f9607f61e18cceca165acb7fcade2697b87477298aa73 WatchSource:0}: Error finding container 15d98d0e79ea384f031f9607f61e18cceca165acb7fcade2697b87477298aa73: Status 404 returned error can't find the container with id 15d98d0e79ea384f031f9607f61e18cceca165acb7fcade2697b87477298aa73 Oct 01 14:55:56 crc kubenswrapper[4771]: W1001 14:55:56.685121 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-f1925c88cbdbc01606be0fb097d0e194686f34deb05142d210421804b7eea08e WatchSource:0}: Error finding container 
f1925c88cbdbc01606be0fb097d0e194686f34deb05142d210421804b7eea08e: Status 404 returned error can't find the container with id f1925c88cbdbc01606be0fb097d0e194686f34deb05142d210421804b7eea08e Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.685975 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.697563 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 01 14:55:56 crc kubenswrapper[4771]: W1001 14:55:56.704411 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-beb1b2275ec4a9c98007b7417759ad2da7db6a08e45a5d8cbf8036215a8e2391 WatchSource:0}: Error finding container beb1b2275ec4a9c98007b7417759ad2da7db6a08e45a5d8cbf8036215a8e2391: Status 404 returned error can't find the container with id beb1b2275ec4a9c98007b7417759ad2da7db6a08e45a5d8cbf8036215a8e2391 Oct 01 14:55:56 crc kubenswrapper[4771]: W1001 14:55:56.714290 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-fb8d6113267b5ae2910cd04a83e7ae7c9cf1a0bc5c54c726b6dd48ecb5ac8d1c WatchSource:0}: Error finding container fb8d6113267b5ae2910cd04a83e7ae7c9cf1a0bc5c54c726b6dd48ecb5ac8d1c: Status 404 returned error can't find the container with id fb8d6113267b5ae2910cd04a83e7ae7c9cf1a0bc5c54c726b6dd48ecb5ac8d1c Oct 01 14:55:56 crc kubenswrapper[4771]: W1001 14:55:56.812549 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 
14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.812695 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.818106 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.820084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.820133 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.820148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.820178 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.820722 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.910862 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:56 crc kubenswrapper[4771]: W1001 14:55:56.975444 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get 
"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:56 crc kubenswrapper[4771]: E1001 14:55:56.975612 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.991332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"fb8d6113267b5ae2910cd04a83e7ae7c9cf1a0bc5c54c726b6dd48ecb5ac8d1c"} Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.992466 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"beb1b2275ec4a9c98007b7417759ad2da7db6a08e45a5d8cbf8036215a8e2391"} Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.993602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"f1925c88cbdbc01606be0fb097d0e194686f34deb05142d210421804b7eea08e"} Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.995287 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"15d98d0e79ea384f031f9607f61e18cceca165acb7fcade2697b87477298aa73"} Oct 01 14:55:56 crc kubenswrapper[4771]: I1001 14:55:56.996494 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f9002cc6552bc3c9f79c6cb2f0f87378f570d37f919d8af4298cd9dcd9313705"} Oct 01 14:55:57 crc kubenswrapper[4771]: W1001 14:55:57.150711 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:57 crc kubenswrapper[4771]: E1001 14:55:57.150879 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 01 14:55:57 crc kubenswrapper[4771]: W1001 14:55:57.261616 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:57 crc kubenswrapper[4771]: E1001 14:55:57.261766 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 01 14:55:57 crc kubenswrapper[4771]: E1001 14:55:57.315001 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="1.6s" Oct 01 14:55:57 crc 
kubenswrapper[4771]: I1001 14:55:57.620950 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:57 crc kubenswrapper[4771]: I1001 14:55:57.622875 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:57 crc kubenswrapper[4771]: I1001 14:55:57.622915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:57 crc kubenswrapper[4771]: I1001 14:55:57.622925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:57 crc kubenswrapper[4771]: I1001 14:55:57.622952 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 14:55:57 crc kubenswrapper[4771]: E1001 14:55:57.623529 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Oct 01 14:55:57 crc kubenswrapper[4771]: I1001 14:55:57.912271 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.001138 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b" exitCode=0 Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.001238 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b"} Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.001323 4771 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.002572 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.002614 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.002628 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.003328 4771 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4" exitCode=0 Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.003388 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4"} Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.003449 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.004589 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.004627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.004640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.006564 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc"} Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.006629 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf"} Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.006642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2"} Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.008676 4771 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333" exitCode=0 Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.008781 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333"} Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.008852 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.010122 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.010160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:58 crc 
kubenswrapper[4771]: I1001 14:55:58.010175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.012286 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2" exitCode=0 Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.012343 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2"} Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.012402 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.013664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.013704 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.013718 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.016886 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.018097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.018134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.018145 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:58 crc kubenswrapper[4771]: W1001 14:55:58.809977 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:58 crc kubenswrapper[4771]: E1001 14:55:58.810109 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 01 14:55:58 crc kubenswrapper[4771]: I1001 14:55:58.909555 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:58 crc kubenswrapper[4771]: E1001 14:55:58.916367 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="3.2s" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.018985 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a" exitCode=0 Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.019086 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a"} 
Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.019135 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.020615 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.020644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.020654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.021904 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.021894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"621005989b46f09ca22c899db00d49019b74cf946212ec59e70ed6d11fd88118"} Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.022664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.022692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.022700 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.026257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46"} Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.026312 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.028374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.028399 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.028411 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.031668 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a"} Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.031710 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.031751 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53"} Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.031773 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b"} Oct 01 14:55:59 crc 
kubenswrapper[4771]: I1001 14:55:59.032553 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.032593 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.032609 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:59 crc kubenswrapper[4771]: W1001 14:55:59.034187 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:59 crc kubenswrapper[4771]: E1001 14:55:59.034269 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.034877 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2"} Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.034905 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931"} Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.034917 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa"} Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.223920 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.225629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.225682 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.225696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.225777 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 14:55:59 crc kubenswrapper[4771]: E1001 14:55:59.226502 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Oct 01 14:55:59 crc kubenswrapper[4771]: W1001 14:55:59.567156 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:55:59 crc kubenswrapper[4771]: E1001 14:55:59.567261 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" 
logger="UnhandledError" Oct 01 14:55:59 crc kubenswrapper[4771]: I1001 14:55:59.909359 4771 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.043576 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"485cfe681f34198bd9ca680708ea086be68766fab847c4bf30049444b74f7355"} Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.043644 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236"} Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.043713 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.044837 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.044902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.044923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.046068 4771 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446" exitCode=0 Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.046134 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446"} Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.046213 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.046243 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.046258 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.046293 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.046304 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.048433 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.048465 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.048482 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.048926 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.048986 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.049004 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.049187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.049232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.049252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.049401 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.049423 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.049445 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.210262 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.220799 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.570173 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:00 crc kubenswrapper[4771]: I1001 14:56:00.681933 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.056129 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52"} Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.056189 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3"} Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.056203 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060"} Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.056220 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.056283 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.057833 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.057861 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.057872 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.057859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.058040 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:01 crc 
kubenswrapper[4771]: I1001 14:56:01.058067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.397490 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:01 crc kubenswrapper[4771]: I1001 14:56:01.605448 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.064059 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57"} Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.064121 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.064141 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1"} Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.064338 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.064472 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.064948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.064970 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:02 crc 
kubenswrapper[4771]: I1001 14:56:02.064979 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.065799 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.065829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.065838 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.066197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.066266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.066287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.427545 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.429722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.429856 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.429878 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.429930 4771 kubelet_node_status.go:76] "Attempting to register node" 
node="crc" Oct 01 14:56:02 crc kubenswrapper[4771]: I1001 14:56:02.546632 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.058351 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.068949 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.069023 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.068992 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.073556 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.073776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.073866 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.073890 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.073917 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.073958 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.074811 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.074870 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:03 crc kubenswrapper[4771]: I1001 14:56:03.074895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.071804 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.071945 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.073125 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.073188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.073199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.073308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.073331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.073339 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.315448 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.315839 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.317911 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.318001 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:04 crc kubenswrapper[4771]: I1001 14:56:04.318022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:05 crc kubenswrapper[4771]: I1001 14:56:05.547272 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 14:56:05 crc kubenswrapper[4771]: I1001 14:56:05.547379 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 14:56:05 crc kubenswrapper[4771]: I1001 14:56:05.582692 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 01 14:56:05 crc kubenswrapper[4771]: I1001 14:56:05.583123 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:05 crc kubenswrapper[4771]: I1001 14:56:05.585205 4771 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:05 crc kubenswrapper[4771]: I1001 14:56:05.585269 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:05 crc kubenswrapper[4771]: I1001 14:56:05.585282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:06 crc kubenswrapper[4771]: E1001 14:56:06.119559 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 01 14:56:07 crc kubenswrapper[4771]: I1001 14:56:07.177479 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 01 14:56:07 crc kubenswrapper[4771]: I1001 14:56:07.177805 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:07 crc kubenswrapper[4771]: I1001 14:56:07.179602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:07 crc kubenswrapper[4771]: I1001 14:56:07.179660 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:07 crc kubenswrapper[4771]: I1001 14:56:07.179674 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:10 crc kubenswrapper[4771]: W1001 14:56:10.404460 4771 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.404567 4771 trace.go:236] Trace[300636753]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 14:56:00.402) (total time: 10001ms): Oct 01 14:56:10 crc 
kubenswrapper[4771]: Trace[300636753]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (14:56:10.404) Oct 01 14:56:10 crc kubenswrapper[4771]: Trace[300636753]: [10.001885907s] [10.001885907s] END Oct 01 14:56:10 crc kubenswrapper[4771]: E1001 14:56:10.404593 4771 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.454420 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.454513 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.458413 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 
14:56:10.458513 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.571711 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.571859 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.688938 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.689121 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.690654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.690724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:10 crc kubenswrapper[4771]: I1001 14:56:10.690789 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.095795 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.098808 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="485cfe681f34198bd9ca680708ea086be68766fab847c4bf30049444b74f7355" exitCode=255 Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.098866 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"485cfe681f34198bd9ca680708ea086be68766fab847c4bf30049444b74f7355"} Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.099059 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.100171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.100210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.100225 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.100932 4771 scope.go:117] "RemoveContainer" containerID="485cfe681f34198bd9ca680708ea086be68766fab847c4bf30049444b74f7355" Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.404360 4771 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]log ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]etcd ok Oct 01 14:56:11 crc 
kubenswrapper[4771]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-api-request-count-filter ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-startkubeinformers ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/generic-apiserver-start-informers ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/priority-and-fairness-config-consumer ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/priority-and-fairness-filter ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-apiextensions-informers ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-apiextensions-controllers ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/crd-informer-synced ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-system-namespaces-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-cluster-authentication-info-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-legacy-token-tracking-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-service-ip-repair-controllers ok Oct 01 14:56:11 crc kubenswrapper[4771]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Oct 01 14:56:11 crc kubenswrapper[4771]: 
[-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/priority-and-fairness-config-producer ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/bootstrap-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/start-kube-aggregator-informers ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/apiservice-status-local-available-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/apiservice-status-remote-available-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/apiservice-registration-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/apiservice-wait-for-first-sync ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/apiservice-discovery-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/kube-apiserver-autoregistration ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]autoregister-completion ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/apiservice-openapi-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: [+]poststarthook/apiservice-openapiv3-controller ok Oct 01 14:56:11 crc kubenswrapper[4771]: livez check failed Oct 01 14:56:11 crc kubenswrapper[4771]: I1001 14:56:11.404434 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:56:12 crc kubenswrapper[4771]: I1001 14:56:12.107043 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 01 14:56:12 crc kubenswrapper[4771]: I1001 14:56:12.110769 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570"} Oct 01 14:56:12 crc kubenswrapper[4771]: I1001 14:56:12.111082 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:12 crc kubenswrapper[4771]: I1001 14:56:12.112471 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:12 crc kubenswrapper[4771]: I1001 14:56:12.112524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:12 crc kubenswrapper[4771]: I1001 14:56:12.112542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:15 crc kubenswrapper[4771]: E1001 14:56:15.442030 4771 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 01 14:56:15 crc kubenswrapper[4771]: I1001 14:56:15.444965 4771 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 01 14:56:15 crc kubenswrapper[4771]: I1001 14:56:15.446009 4771 trace.go:236] Trace[332185425]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 14:56:02.386) (total time: 13059ms): Oct 01 14:56:15 crc kubenswrapper[4771]: Trace[332185425]: ---"Objects listed" error: 13058ms (14:56:15.445) Oct 01 14:56:15 crc kubenswrapper[4771]: Trace[332185425]: [13.059064222s] [13.059064222s] END Oct 01 14:56:15 crc kubenswrapper[4771]: I1001 14:56:15.446075 4771 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 01 14:56:15 crc 
kubenswrapper[4771]: E1001 14:56:15.448063 4771 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 01 14:56:15 crc kubenswrapper[4771]: I1001 14:56:15.449413 4771 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 01 14:56:15 crc kubenswrapper[4771]: I1001 14:56:15.450469 4771 trace.go:236] Trace[1070844422]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (01-Oct-2025 14:56:02.754) (total time: 12695ms): Oct 01 14:56:15 crc kubenswrapper[4771]: Trace[1070844422]: ---"Objects listed" error: 12695ms (14:56:15.450) Oct 01 14:56:15 crc kubenswrapper[4771]: Trace[1070844422]: [12.695418137s] [12.695418137s] END Oct 01 14:56:15 crc kubenswrapper[4771]: I1001 14:56:15.450515 4771 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 01 14:56:15 crc kubenswrapper[4771]: I1001 14:56:15.547617 4771 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 01 14:56:15 crc kubenswrapper[4771]: I1001 14:56:15.547795 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 01 14:56:16 crc kubenswrapper[4771]: E1001 14:56:16.119664 4771 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node 
info: node \"crc\" not found" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.404423 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.404672 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.404831 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.406359 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.406434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.406462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.410515 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.421696 4771 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.905406 4771 apiserver.go:52] "Watching apiserver" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.910433 4771 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.910804 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.911223 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.911344 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.911380 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:16 crc kubenswrapper[4771]: E1001 14:56:16.911556 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.911613 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:16 crc kubenswrapper[4771]: E1001 14:56:16.912034 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.912400 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.912445 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 14:56:16 crc kubenswrapper[4771]: E1001 14:56:16.912467 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.913581 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.914472 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.919163 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.919394 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.919511 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.919935 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.919984 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.920074 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.920152 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.947177 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.965200 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.978632 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:16 crc kubenswrapper[4771]: I1001 14:56:16.991704 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.003539 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.010106 4771 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.014615 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.024790 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059245 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059348 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059405 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059449 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059465 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059480 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059496 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 
14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059533 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059549 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059564 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059581 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059596 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059615 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059657 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059674 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059715 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 14:56:17 crc kubenswrapper[4771]: 
I1001 14:56:17.059752 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059773 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059791 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059858 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059873 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059889 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059942 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.059991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060135 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060159 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060181 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 
14:56:17.060198 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060215 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060231 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060247 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060264 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060286 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060330 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060362 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060343 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060384 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060471 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060508 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060519 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060543 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060561 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060621 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060662 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060679 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060680 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060698 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060716 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060776 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060796 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060815 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060832 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060850 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060868 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060885 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060907 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060919 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060940 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060959 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060976 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.060993 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061011 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061029 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061086 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061102 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061118 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061133 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 14:56:17 
crc kubenswrapper[4771]: I1001 14:56:17.061148 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061168 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061186 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061204 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061220 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061235 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061254 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061273 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061373 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061408 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061431 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061451 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061524 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061549 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061567 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061584 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061635 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061658 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 
14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061701 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061718 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061762 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061781 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061797 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061815 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") 
pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061832 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061850 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061866 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061883 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061917 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061953 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061969 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061985 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062002 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062017 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062033 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062050 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062066 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062100 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062118 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062134 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062150 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062181 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062196 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062212 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062228 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062245 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062263 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062279 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062294 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062311 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062331 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062361 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062384 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 
01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062402 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062418 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062434 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062451 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062479 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062496 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: 
\"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062528 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062545 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062563 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062586 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 14:56:17 crc 
kubenswrapper[4771]: I1001 14:56:17.062603 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062621 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062638 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062654 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062672 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062689 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062706 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062722 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062779 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062796 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " 
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062813 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062830 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062850 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062867 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062883 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062916 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062932 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062950 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062966 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062983 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.062999 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063017 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063035 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063075 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063161 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063182 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063201 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063250 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063269 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063288 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063307 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063326 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063345 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063364 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063382 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063399 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063421 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063448 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063493 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063511 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063530 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063550 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063567 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063584 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063601 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063634 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063652 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063697 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.063721 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075162 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 
01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075223 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075251 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075279 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075301 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" 
(UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075353 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075402 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075420 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075460 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075515 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075530 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075545 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075559 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075570 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath 
\"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075585 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.076142 4771 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.076501 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.079657 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.087825 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061075 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: 
"kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061283 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.091672 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061585 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061749 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061823 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.061832 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075345 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075359 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075431 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075358 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075496 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075484 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075570 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075532 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075566 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075645 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075654 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075638 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075637 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075716 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075715 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075671 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.075935 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.075994 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.076012 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:56:17.575985289 +0000 UTC m=+22.195160680 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.076192 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.076206 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.076284 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.076478 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.076639 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.076656 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.076694 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.077261 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.077271 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.077392 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.077402 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.077431 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.077508 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.077610 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.077882 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.078043 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.078287 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.078362 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.078376 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.078208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.079119 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.079211 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.079251 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.079626 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.079920 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.080521 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.080565 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.080569 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.080642 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.080777 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.080959 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.081000 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.081468 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.086940 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.087281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.087489 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.087480 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.087697 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.087921 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.088164 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.088190 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.088271 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.088412 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.088576 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.088676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.088679 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.089034 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.089805 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.090042 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.090128 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.090442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.090911 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.090985 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.091188 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.091234 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.091405 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.091444 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.091589 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.091961 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.091985 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.092215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.092320 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.092754 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.092773 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.092873 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.093013 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.093246 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.093464 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.093551 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.093619 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.093653 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.093780 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.093930 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.094030 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.094282 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.094530 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.094554 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.094837 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.094854 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.095070 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.095142 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.095189 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.095266 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.095294 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.095472 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.095927 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.095961 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.096011 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.096093 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.096153 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.099173 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.100049 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.100442 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.100664 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.100752 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.100826 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.100821 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.101048 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.101166 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.101256 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.101289 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.101316 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.101192 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.101594 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.102330 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.102551 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.102574 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.102590 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.103577 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.103871 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.104168 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.104334 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.104743 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.104882 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.105407 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.105479 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.105561 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.105606 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:17.605565121 +0000 UTC m=+22.224740292 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.105638 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:17.605630433 +0000 UTC m=+22.224805604 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.105900 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:17.605889159 +0000 UTC m=+22.225064330 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.105925 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:17.60591715 +0000 UTC m=+22.225092341 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.105949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.106306 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.106331 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.106698 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.106776 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.108717 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.108760 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.108963 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.110213 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.111006 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.111054 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.111279 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.113299 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.113394 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.113462 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.113494 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.113622 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.113724 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.113766 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.114059 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.114440 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.114434 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.114599 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.118229 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.119005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.119018 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.119321 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.120256 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.121833 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.121866 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.122042 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.122212 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.122315 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.122416 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.122711 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.123083 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.122539 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.123262 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.122853 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.123599 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.123610 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.123802 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.123861 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.123958 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.124056 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.124175 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.125637 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.125648 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.125856 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.126069 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.126512 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.127300 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.127485 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.128767 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.129617 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.129635 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.131335 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.135908 4771 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570" exitCode=255 Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.135996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570"} Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.136058 4771 scope.go:117] "RemoveContainer" containerID="485cfe681f34198bd9ca680708ea086be68766fab847c4bf30049444b74f7355" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.144376 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.149673 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.152855 4771 scope.go:117] "RemoveContainer" containerID="c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570" Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.153255 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.154989 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.160862 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: 
"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.162008 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.167992 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.170020 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.176814 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.176928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.177189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.177473 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.177633 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.177784 4771 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.177938 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178089 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178206 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178364 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath 
\"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178515 4771 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178595 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178675 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178836 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178912 4771 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.178983 4771 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179175 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179270 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" 
(UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179358 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179442 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179527 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179611 4771 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179701 4771 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179817 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179912 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180005 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180093 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.179036 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180173 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180712 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180763 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" 
(UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180778 4771 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180791 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180802 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180814 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180846 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180858 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180869 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180881 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180891 4771 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180926 4771 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180941 4771 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180953 4771 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.180966 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181004 4771 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181019 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181035 4771 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181049 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181087 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181104 4771 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181120 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181135 4771 reconciler_common.go:293] "Volume detached for volume \"certs\" 
(UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181171 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181184 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181196 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181208 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181240 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181254 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181266 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" 
Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181280 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181293 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181327 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181341 4771 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181355 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181368 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181403 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181417 4771 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181432 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181444 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181457 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181488 4771 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181501 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181513 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181526 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on 
node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181541 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181579 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181594 4771 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181608 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181621 4771 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181661 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181677 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181692 4771 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181723 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181752 4771 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181764 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181776 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181810 4771 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181822 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181834 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181849 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181861 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181894 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181932 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181965 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181979 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.181990 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182001 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182013 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182059 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182071 4771 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182084 4771 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182094 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182128 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath 
\"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182141 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182151 4771 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182161 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182171 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182209 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182221 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182234 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.177556 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182245 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182321 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182405 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182417 4771 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182430 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182466 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc 
kubenswrapper[4771]: I1001 14:56:17.182478 4771 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182488 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182498 4771 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182507 4771 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182517 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182550 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182561 4771 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182569 4771 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182578 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182588 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182597 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182631 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182640 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182658 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182668 4771 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.182983 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184537 4771 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184588 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184606 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184617 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184654 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184666 4771 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184676 4771 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184687 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184718 4771 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184753 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184763 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184772 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184781 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" 
DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184790 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184818 4771 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184828 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184837 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184849 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184894 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184908 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184920 4771 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184929 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184981 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.184992 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185002 4771 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185012 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185021 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185031 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185041 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185051 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185062 4771 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185072 4771 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185085 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185095 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185106 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185116 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185125 4771 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185135 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185145 4771 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185155 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185164 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185174 4771 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 
14:56:17.185183 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185195 4771 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185205 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185214 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185224 4771 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185234 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185244 4771 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185259 4771 reconciler_common.go:293] "Volume 
detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185269 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185278 4771 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185288 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185300 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185310 4771 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185320 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185330 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185340 4771 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185350 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.185359 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.199088 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.202465 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.211327 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.214751 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.224221 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.234442 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.238189 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485cfe681f34198bd9ca680708ea086be68766fab847c4bf30049444b74f7355\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:10Z\\\",\\\"message\\\":\\\"W1001 14:55:59.498873 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 14:55:59.499342 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759330559 cert, and key in /tmp/serving-cert-3096108512/serving-signer.crt, /tmp/serving-cert-3096108512/serving-signer.key\\\\nI1001 14:55:59.752087 1 observer_polling.go:159] Starting file observer\\\\nW1001 14:55:59.754939 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 14:55:59.755123 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:55:59.758070 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3096108512/tls.crt::/tmp/serving-cert-3096108512/tls.key\\\\\\\"\\\\nF1001 14:56:10.209098 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.244871 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.249576 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: W1001 14:56:17.254897 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-519d5fd0c3d823b374e2030caa459730a53bc66618033ad76e0c98de0d428576 WatchSource:0}: Error finding container 519d5fd0c3d823b374e2030caa459730a53bc66618033ad76e0c98de0d428576: Status 404 returned error can't find the container with id 519d5fd0c3d823b374e2030caa459730a53bc66618033ad76e0c98de0d428576 Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.256709 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.258620 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.266109 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: W1001 14:56:17.278638 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-fd1e9ceb8935649818cdb24f8634382d44069a29f86c461fa9389f0945183da7 WatchSource:0}: Error finding container fd1e9ceb8935649818cdb24f8634382d44069a29f86c461fa9389f0945183da7: Status 404 returned error can't find the container with id fd1e9ceb8935649818cdb24f8634382d44069a29f86c461fa9389f0945183da7 Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.280037 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.294127 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.310810 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.326295 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.588522 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.588754 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:56:18.588708674 +0000 UTC m=+23.207883845 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.690215 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.690304 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.690334 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.690579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690504 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690665 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690777 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:18.690710412 +0000 UTC m=+23.309885583 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690513 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690822 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690513 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690845 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690876 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690923 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.690809 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:18.690797084 +0000 UTC m=+23.309972255 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.691086 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:18.69105067 +0000 UTC m=+23.310225841 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:17 crc kubenswrapper[4771]: E1001 14:56:17.691110 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:18.691097531 +0000 UTC m=+23.310272712 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.988271 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.988888 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.990118 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.991129 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.993148 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.993705 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.994541 4771 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.995554 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.996295 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.997345 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.997883 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.999066 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 01 14:56:17 crc kubenswrapper[4771]: I1001 14:56:17.999535 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.000452 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.001027 4771 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.001902 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.002451 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.002852 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.004010 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.004621 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.005112 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.006238 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.006654 4771 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.007651 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.008054 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.009310 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.009957 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.010409 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.011330 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.011803 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.012679 4771 
kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.012807 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.014641 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.015541 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.016076 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.017587 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.019028 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.019772 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 01 14:56:18 
crc kubenswrapper[4771]: I1001 14:56:18.020420 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.021462 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.021940 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.023062 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.024059 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.024680 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.025564 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.026166 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 01 14:56:18 
crc kubenswrapper[4771]: I1001 14:56:18.027206 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.027948 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.028804 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.029270 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.029772 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.030614 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.031249 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.032232 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 01 14:56:18 
crc kubenswrapper[4771]: I1001 14:56:18.140748 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3"} Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.140824 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff"} Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.140842 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"519d5fd0c3d823b374e2030caa459730a53bc66618033ad76e0c98de0d428576"} Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.142291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7"} Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.142360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fdc2fd6b8f6d0405712b04c3ee8ebf30efe2e30bf58fce8e659d995662d13b14"} Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.144195 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.147652 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"fd1e9ceb8935649818cdb24f8634382d44069a29f86c461fa9389f0945183da7"} Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.154837 4771 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-crc\" already exists" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.155276 4771 scope.go:117] "RemoveContainer" containerID="c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570" Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.155529 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.159269 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.172897 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.188080 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.201322 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.212316 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.231298 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.246220 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485cfe681f34198bd9ca680708ea086be68766fab847c4bf30049444b74f7355\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:10Z\\\",\\\"message\\\":\\\"W1001 14:55:59.498873 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI1001 14:55:59.499342 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759330559 cert, and key in /tmp/serving-cert-3096108512/serving-signer.crt, /tmp/serving-cert-3096108512/serving-signer.key\\\\nI1001 14:55:59.752087 1 observer_polling.go:159] Starting file observer\\\\nW1001 14:55:59.754939 1 builder.go:272] unable to get owner reference (falling back to namespace): Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI1001 14:55:59.755123 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:55:59.758070 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3096108512/tls.crt::/tmp/serving-cert-3096108512/tls.key\\\\\\\"\\\\nF1001 14:56:10.209098 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] 
\\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.262916 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.280379 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.295572 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.312782 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.335751 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.349709 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.367438 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.387928 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.406821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:18Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.598653 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.598904 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:56:20.598862376 +0000 UTC m=+25.218037547 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.700160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.700205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.700225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.700265 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700380 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700454 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:20.700438243 +0000 UTC m=+25.319613404 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700500 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700595 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700679 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:18 
crc kubenswrapper[4771]: E1001 14:56:18.700638 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:20.700598587 +0000 UTC m=+25.319773758 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700516 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700806 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700831 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700703 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700926 4771 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:20.700895645 +0000 UTC m=+25.320070856 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.700997 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:20.700962826 +0000 UTC m=+25.320138207 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.984961 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.984961 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.985230 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:18 crc kubenswrapper[4771]: I1001 14:56:18.985021 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.985295 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:18 crc kubenswrapper[4771]: E1001 14:56:18.985333 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:19 crc kubenswrapper[4771]: I1001 14:56:19.151098 4771 scope.go:117] "RemoveContainer" containerID="c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570" Oct 01 14:56:19 crc kubenswrapper[4771]: E1001 14:56:19.151398 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 14:56:20 crc kubenswrapper[4771]: I1001 14:56:20.619138 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.619383 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:56:24.619342604 +0000 UTC m=+29.238517785 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:56:20 crc kubenswrapper[4771]: I1001 14:56:20.719799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:20 crc kubenswrapper[4771]: I1001 14:56:20.719856 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:20 crc kubenswrapper[4771]: I1001 14:56:20.719886 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:20 crc kubenswrapper[4771]: I1001 14:56:20.719910 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720007 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720065 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720031 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720125 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:24.720098952 +0000 UTC m=+29.339274113 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720146 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720153 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:24.720141623 +0000 UTC m=+29.339316794 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720164 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720214 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2025-10-01 14:56:24.720189054 +0000 UTC m=+29.339364235 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720031 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720247 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720262 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.720301 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:24.720290756 +0000 UTC m=+29.339466047 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:20 crc kubenswrapper[4771]: I1001 14:56:20.984586 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:20 crc kubenswrapper[4771]: I1001 14:56:20.984668 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:20 crc kubenswrapper[4771]: I1001 14:56:20.984762 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.984790 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.984982 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:20 crc kubenswrapper[4771]: E1001 14:56:20.985057 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.157438 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351"} Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.172438 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.189940 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.207930 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.224387 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.239822 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.265203 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.283610 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.298782 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.848801 4771 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.851472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.851566 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.851587 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.851696 4771 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.865225 4771 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.865362 4771 kubelet_node_status.go:79] "Successfully registered node" 
node="crc" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.866525 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.866564 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.866578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.866599 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.866614 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:21Z","lastTransitionTime":"2025-10-01T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:21 crc kubenswrapper[4771]: E1001 14:56:21.909779 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.916527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.916619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.916632 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.916652 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.916665 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:21Z","lastTransitionTime":"2025-10-01T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:21 crc kubenswrapper[4771]: E1001 14:56:21.930298 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.934406 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.934468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.934481 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.934504 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.934519 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:21Z","lastTransitionTime":"2025-10-01T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:21 crc kubenswrapper[4771]: E1001 14:56:21.965076 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.970140 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.970192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.970202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.970223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:21 crc kubenswrapper[4771]: I1001 14:56:21.970237 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:21Z","lastTransitionTime":"2025-10-01T14:56:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:21 crc kubenswrapper[4771]: E1001 14:56:21.995889 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:21Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.000249 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.000312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.000325 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.000345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.000359 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: E1001 14:56:22.013314 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: E1001 14:56:22.013468 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.015678 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.015716 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.015741 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.015767 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.015782 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.118280 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.118319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.118329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.118346 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.118356 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.225578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.225643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.225656 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.225680 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.225697 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.293661 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kmlgz"] Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.294035 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.296464 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.296678 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.296763 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.296943 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.318305 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.327845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.327904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.327919 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc 
kubenswrapper[4771]: I1001 14:56:22.327944 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.327961 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.347013 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 
14:56:22.368292 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.417654 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.430861 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.430933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.430946 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.430966 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.430978 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.434324 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2129365d-0a99-4cf0-a561-fd4126d1bfc7-serviceca\") pod \"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.434408 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2129365d-0a99-4cf0-a561-fd4126d1bfc7-host\") pod \"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.434431 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqx4k\" (UniqueName: \"kubernetes.io/projected/2129365d-0a99-4cf0-a561-fd4126d1bfc7-kube-api-access-qqx4k\") pod \"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.435275 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a
2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.453385 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.467519 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.483840 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.495878 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.534699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.534767 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.534781 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 
14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.534851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.534866 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.535100 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2129365d-0a99-4cf0-a561-fd4126d1bfc7-host\") pod \"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.535158 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqx4k\" (UniqueName: \"kubernetes.io/projected/2129365d-0a99-4cf0-a561-fd4126d1bfc7-kube-api-access-qqx4k\") pod \"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.535194 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2129365d-0a99-4cf0-a561-fd4126d1bfc7-serviceca\") pod \"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.535277 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2129365d-0a99-4cf0-a561-fd4126d1bfc7-host\") pod 
\"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.545979 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2129365d-0a99-4cf0-a561-fd4126d1bfc7-serviceca\") pod \"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.551651 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.557082 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqx4k\" (UniqueName: \"kubernetes.io/projected/2129365d-0a99-4cf0-a561-fd4126d1bfc7-kube-api-access-qqx4k\") pod \"node-ca-kmlgz\" (UID: \"2129365d-0a99-4cf0-a561-fd4126d1bfc7\") " pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.558139 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.570599 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a
2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.587628 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.596289 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.602261 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.608553 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kmlgz" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.614544 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: W1001 14:56:22.621278 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2129365d_0a99_4cf0_a561_fd4126d1bfc7.slice/crio-af231a52b857039efe3c82d5e5874dc62448e2b68c29d2fea532d447153c22ec WatchSource:0}: Error finding container af231a52b857039efe3c82d5e5874dc62448e2b68c29d2fea532d447153c22ec: Status 404 returned error can't find the container with id af231a52b857039efe3c82d5e5874dc62448e2b68c29d2fea532d447153c22ec Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.629822 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.637758 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.637828 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.637845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.637872 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.637891 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.661826 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.678238 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.692020 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.707031 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-7wr7q"] Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.707427 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7wr7q" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.710391 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.711033 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.711307 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.713375 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.732487 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.740141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.740181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.740195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.740212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.740226 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.754893 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.770142 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.797536 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.813401 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.837063 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.838173 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdn7t\" (UniqueName: \"kubernetes.io/projected/0bb66959-800a-45dc-909f-1a093c578823-kube-api-access-bdn7t\") pod \"node-resolver-7wr7q\" (UID: \"0bb66959-800a-45dc-909f-1a093c578823\") " pod="openshift-dns/node-resolver-7wr7q" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.838224 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0bb66959-800a-45dc-909f-1a093c578823-hosts-file\") pod \"node-resolver-7wr7q\" (UID: \"0bb66959-800a-45dc-909f-1a093c578823\") " pod="openshift-dns/node-resolver-7wr7q" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.847898 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.847944 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.847956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 
14:56:22.847974 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.847985 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.882796 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.926020 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.939617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0bb66959-800a-45dc-909f-1a093c578823-hosts-file\") pod \"node-resolver-7wr7q\" (UID: \"0bb66959-800a-45dc-909f-1a093c578823\") " pod="openshift-dns/node-resolver-7wr7q" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.939710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdn7t\" (UniqueName: 
\"kubernetes.io/projected/0bb66959-800a-45dc-909f-1a093c578823-kube-api-access-bdn7t\") pod \"node-resolver-7wr7q\" (UID: \"0bb66959-800a-45dc-909f-1a093c578823\") " pod="openshift-dns/node-resolver-7wr7q" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.939861 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0bb66959-800a-45dc-909f-1a093c578823-hosts-file\") pod \"node-resolver-7wr7q\" (UID: \"0bb66959-800a-45dc-909f-1a093c578823\") " pod="openshift-dns/node-resolver-7wr7q" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.951061 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.951100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.951112 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.951132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.951143 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:22Z","lastTransitionTime":"2025-10-01T14:56:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.958432 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.963081 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdn7t\" (UniqueName: \"kubernetes.io/projected/0bb66959-800a-45dc-909f-1a093c578823-kube-api-access-bdn7t\") pod \"node-resolver-7wr7q\" (UID: \"0bb66959-800a-45dc-909f-1a093c578823\") " pod="openshift-dns/node-resolver-7wr7q" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.976896 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.984264 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.984305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.984305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:22 crc kubenswrapper[4771]: E1001 14:56:22.984404 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:22 crc kubenswrapper[4771]: E1001 14:56:22.984580 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:22 crc kubenswrapper[4771]: E1001 14:56:22.984693 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:22 crc kubenswrapper[4771]: I1001 14:56:22.995766 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:22Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.018420 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7wr7q" Oct 01 14:56:23 crc kubenswrapper[4771]: W1001 14:56:23.034208 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bb66959_800a_45dc_909f_1a093c578823.slice/crio-1aba95ebda7912a40ddc6e30c67d2718804bd06fff380986541df01575d1e013 WatchSource:0}: Error finding container 1aba95ebda7912a40ddc6e30c67d2718804bd06fff380986541df01575d1e013: Status 404 returned error can't find the container with id 1aba95ebda7912a40ddc6e30c67d2718804bd06fff380986541df01575d1e013 Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.054225 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.054269 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.054281 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.054301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.054315 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.112609 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-9lvcz"] Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.113022 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.113611 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-vck47"] Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.114120 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.117347 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.117345 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.117533 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j7ntp"] Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.118310 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.118518 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-jj6k4"] Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.119355 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.120752 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.120924 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.121187 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.121404 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.121538 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.121657 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.121795 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.121910 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.122076 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.122562 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 
14:56:23.124127 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.124152 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.124167 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.124229 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.124137 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.124533 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.124721 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.134987 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.148228 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.157251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.157297 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.157310 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.157330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.157344 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.162674 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.164870 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kmlgz" event={"ID":"2129365d-0a99-4cf0-a561-fd4126d1bfc7","Type":"ContainerStarted","Data":"1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.164915 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kmlgz" event={"ID":"2129365d-0a99-4cf0-a561-fd4126d1bfc7","Type":"ContainerStarted","Data":"af231a52b857039efe3c82d5e5874dc62448e2b68c29d2fea532d447153c22ec"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.167201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7wr7q" event={"ID":"0bb66959-800a-45dc-909f-1a093c578823","Type":"ContainerStarted","Data":"1aba95ebda7912a40ddc6e30c67d2718804bd06fff380986541df01575d1e013"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.180147 4771 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\"
,\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd
4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.199225 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.233337 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.242172 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.242573 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-socket-dir-parent\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.242706 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-kubelet\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.242888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-script-lib\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243009 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/289ee6d3-fabe-417f-964c-76ca03c143cc-proxy-tls\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243112 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-system-cni-dir\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243217 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243349 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-log-socket\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243463 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-bin\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243570 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c96a3328-c79b-4528-b9b5-badbc7380dd6-cni-binary-copy\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243660 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/289ee6d3-fabe-417f-964c-76ca03c143cc-mcd-auth-proxy-config\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-systemd-units\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.243899 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-ovn-kubernetes\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-cnibin\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244110 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-cni-bin\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244132 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-hostroot\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244177 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-multus-certs\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244195 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs5q5\" (UniqueName: \"kubernetes.io/projected/c96a3328-c79b-4528-b9b5-badbc7380dd6-kube-api-access-xs5q5\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-node-log\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244241 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-netd\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244258 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-netns\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-etc-kubernetes\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244294 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-netns\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244309 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-systemd\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244325 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovn-node-metrics-cert\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244343 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4be36f4b-1171-4281-a7ac-43e411e080f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27krx\" (UniqueName: \"kubernetes.io/projected/4be36f4b-1171-4281-a7ac-43e411e080f7-kube-api-access-27krx\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244386 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-slash\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244406 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-cni-dir\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc 
kubenswrapper[4771]: I1001 14:56:23.244422 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4be36f4b-1171-4281-a7ac-43e411e080f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244441 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-etc-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244459 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-ovn\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244474 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-daemon-config\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244490 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-cnibin\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " 
pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244518 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244534 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-env-overrides\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244551 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-kubelet\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244570 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/289ee6d3-fabe-417f-964c-76ca03c143cc-rootfs\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244590 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-os-release\") pod \"multus-9lvcz\" (UID: 
\"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244605 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-conf-dir\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244624 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-os-release\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-config\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244665 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-system-cni-dir\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244683 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2svt4\" (UniqueName: \"kubernetes.io/projected/289ee6d3-fabe-417f-964c-76ca03c143cc-kube-api-access-2svt4\") pod \"machine-config-daemon-vck47\" 
(UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-var-lib-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244749 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnspx\" (UniqueName: \"kubernetes.io/projected/a061b8e2-74a8-4953-bfa2-5090a2f70459-kube-api-access-nnspx\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244765 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-cni-multus\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.244788 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-k8s-cni-cncf-io\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.248156 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.260576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.260636 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.260648 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 
14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.260669 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.260684 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.273345 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/
var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.286341 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.300783 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.315898 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.326471 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345688 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-var-lib-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345769 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnspx\" (UniqueName: 
\"kubernetes.io/projected/a061b8e2-74a8-4953-bfa2-5090a2f70459-kube-api-access-nnspx\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345802 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-cni-multus\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345845 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-k8s-cni-cncf-io\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345888 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345847 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-var-lib-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345921 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-socket-dir-parent\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345941 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-kubelet\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345959 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-script-lib\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346018 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-socket-dir-parent\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/289ee6d3-fabe-417f-964c-76ca03c143cc-proxy-tls\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-kubelet\") pod 
\"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.345907 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-cni-multus\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-system-cni-dir\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-log-socket\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346229 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-bin\") pod \"ovnkube-node-j7ntp\" 
(UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c96a3328-c79b-4528-b9b5-badbc7380dd6-cni-binary-copy\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346269 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/289ee6d3-fabe-417f-964c-76ca03c143cc-mcd-auth-proxy-config\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346294 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-systemd-units\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346309 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-ovn-kubernetes\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346327 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-cnibin\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " 
pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346344 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-cni-bin\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346358 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-hostroot\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346388 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-multus-certs\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346404 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs5q5\" (UniqueName: \"kubernetes.io/projected/c96a3328-c79b-4528-b9b5-badbc7380dd6-kube-api-access-xs5q5\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346422 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-node-log\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346438 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-netd\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346454 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-netns\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346468 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-etc-kubernetes\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346489 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-netns\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346505 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-systemd\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346521 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovn-node-metrics-cert\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4be36f4b-1171-4281-a7ac-43e411e080f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346555 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27krx\" (UniqueName: \"kubernetes.io/projected/4be36f4b-1171-4281-a7ac-43e411e080f7-kube-api-access-27krx\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346574 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-slash\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346610 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-cni-dir\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346633 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/4be36f4b-1171-4281-a7ac-43e411e080f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346652 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-etc-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-ovn\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-daemon-config\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346698 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-cnibin\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346741 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346760 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-env-overrides\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346779 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-kubelet\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346796 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/289ee6d3-fabe-417f-964c-76ca03c143cc-rootfs\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346817 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-os-release\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346833 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-conf-dir\") pod \"multus-9lvcz\" 
(UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-os-release\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346877 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-config\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346887 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-script-lib\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346949 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-system-cni-dir\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346970 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-system-cni-dir\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " 
pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346983 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.346892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-system-cni-dir\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347013 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-log-socket\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347022 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2svt4\" (UniqueName: \"kubernetes.io/projected/289ee6d3-fabe-417f-964c-76ca03c143cc-kube-api-access-2svt4\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347038 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-bin\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 
14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347606 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-systemd-units\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347657 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-ovn-kubernetes\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347707 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-cnibin\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-cni-bin\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347795 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c96a3328-c79b-4528-b9b5-badbc7380dd6-cni-binary-copy\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347807 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-hostroot\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347827 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/289ee6d3-fabe-417f-964c-76ca03c143cc-mcd-auth-proxy-config\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347866 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-multus-certs\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347872 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-slash\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.347957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348022 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-node-log\") pod \"ovnkube-node-j7ntp\" (UID: 
\"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348066 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-netd\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348101 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-netns\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348119 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-cni-dir\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348141 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-etc-kubernetes\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348186 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-netns\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348225 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-systemd\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348545 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-env-overrides\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348589 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-var-lib-kubelet\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348614 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/289ee6d3-fabe-417f-964c-76ca03c143cc-rootfs\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348666 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-os-release\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348698 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-conf-dir\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348766 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-os-release\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348873 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4be36f4b-1171-4281-a7ac-43e411e080f7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348934 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-etc-openvswitch\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.348977 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-ovn\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.349238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-config\") pod 
\"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.349287 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-cnibin\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.349474 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c96a3328-c79b-4528-b9b5-badbc7380dd6-multus-daemon-config\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.349768 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4be36f4b-1171-4281-a7ac-43e411e080f7-cni-binary-copy\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.349838 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c96a3328-c79b-4528-b9b5-badbc7380dd6-host-run-k8s-cni-cncf-io\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.351061 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/289ee6d3-fabe-417f-964c-76ca03c143cc-proxy-tls\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.355700 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4be36f4b-1171-4281-a7ac-43e411e080f7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.358387 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovn-node-metrics-cert\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.362994 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.363032 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.363043 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.363062 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.363074 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.368568 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.369653 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2svt4\" (UniqueName: \"kubernetes.io/projected/289ee6d3-fabe-417f-964c-76ca03c143cc-kube-api-access-2svt4\") pod \"machine-config-daemon-vck47\" (UID: \"289ee6d3-fabe-417f-964c-76ca03c143cc\") " pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.370052 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnspx\" (UniqueName: \"kubernetes.io/projected/a061b8e2-74a8-4953-bfa2-5090a2f70459-kube-api-access-nnspx\") pod \"ovnkube-node-j7ntp\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.372937 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27krx\" (UniqueName: \"kubernetes.io/projected/4be36f4b-1171-4281-a7ac-43e411e080f7-kube-api-access-27krx\") pod \"multus-additional-cni-plugins-jj6k4\" (UID: \"4be36f4b-1171-4281-a7ac-43e411e080f7\") " pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.374952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs5q5\" (UniqueName: \"kubernetes.io/projected/c96a3328-c79b-4528-b9b5-badbc7380dd6-kube-api-access-xs5q5\") pod \"multus-9lvcz\" (UID: \"c96a3328-c79b-4528-b9b5-badbc7380dd6\") " pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.390709 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.410319 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.427038 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-9lvcz" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.433470 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.440698 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:56:23 crc kubenswrapper[4771]: W1001 14:56:23.443407 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc96a3328_c79b_4528_b9b5_badbc7380dd6.slice/crio-4d7fb60f4ef8f9d0bb725ee260c87d5c9073124544252739eeceee6fafc24da9 WatchSource:0}: Error finding container 4d7fb60f4ef8f9d0bb725ee260c87d5c9073124544252739eeceee6fafc24da9: Status 404 returned error can't find the container with id 4d7fb60f4ef8f9d0bb725ee260c87d5c9073124544252739eeceee6fafc24da9 Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.447020 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.453522 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.465681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.465746 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.465758 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.465778 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.465789 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.473509 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.492724 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: W1001 14:56:23.498123 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod289ee6d3_fabe_417f_964c_76ca03c143cc.slice/crio-7579089563cbb494cb6f7ffd5a6e13c9241437d28ce47000d68a8b1bd1557e03 WatchSource:0}: Error finding container 7579089563cbb494cb6f7ffd5a6e13c9241437d28ce47000d68a8b1bd1557e03: Status 404 returned error can't find the container with id 7579089563cbb494cb6f7ffd5a6e13c9241437d28ce47000d68a8b1bd1557e03 Oct 01 14:56:23 crc kubenswrapper[4771]: W1001 14:56:23.500268 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda061b8e2_74a8_4953_bfa2_5090a2f70459.slice/crio-6f55c7ea12abf59b4a544c6ba7573cdcbc6029096e356a953a56fbd258cc3dd1 WatchSource:0}: Error finding container 6f55c7ea12abf59b4a544c6ba7573cdcbc6029096e356a953a56fbd258cc3dd1: Status 404 returned error can't find the container with id 
6f55c7ea12abf59b4a544c6ba7573cdcbc6029096e356a953a56fbd258cc3dd1 Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.513123 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.544037 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.555099 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.573075 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.573118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.573129 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.573154 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.573166 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.573938 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.597091 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var
/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.610420 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 
14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.624226 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.638602 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.651955 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387
cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:23Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.677855 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.677899 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.677908 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.677926 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.677937 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.781253 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.781313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.781328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.781352 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.781366 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.884296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.884342 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.884357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.884377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.884391 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.987538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.987587 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.987600 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.987619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:23 crc kubenswrapper[4771]: I1001 14:56:23.987635 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:23Z","lastTransitionTime":"2025-10-01T14:56:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.045275 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.046077 4771 scope.go:117] "RemoveContainer" containerID="c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570" Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.046261 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.091393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.091440 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.091455 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.091475 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.091515 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.173698 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b" exitCode=0 Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.173798 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.173902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"6f55c7ea12abf59b4a544c6ba7573cdcbc6029096e356a953a56fbd258cc3dd1"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.175786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.175864 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.175879 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"7579089563cbb494cb6f7ffd5a6e13c9241437d28ce47000d68a8b1bd1557e03"} Oct 01 14:56:24 crc 
kubenswrapper[4771]: I1001 14:56:24.177451 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerStarted","Data":"3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.177479 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerStarted","Data":"d0419c21c7e2251600bd716a4fcb200efc4474d7ed5e555a28ea3ae97f4e393e"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.179480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lvcz" event={"ID":"c96a3328-c79b-4528-b9b5-badbc7380dd6","Type":"ContainerStarted","Data":"1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.179556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lvcz" event={"ID":"c96a3328-c79b-4528-b9b5-badbc7380dd6","Type":"ContainerStarted","Data":"4d7fb60f4ef8f9d0bb725ee260c87d5c9073124544252739eeceee6fafc24da9"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.181134 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7wr7q" event={"ID":"0bb66959-800a-45dc-909f-1a093c578823","Type":"ContainerStarted","Data":"7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.188248 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.193672 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.193713 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.193724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.193766 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.193780 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.202673 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.226657 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.247427 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.262766 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.277106 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.290025 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.296902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.296942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.297004 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 
14:56:24.297024 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.297038 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.324063 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.337960 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2b
b72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.349755 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.363488 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.409353 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.416518 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.416570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.416580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.416604 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.416615 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.428152 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.440603 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.456021 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.469628 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.485465 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.494897 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.509912 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\
":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\"
:\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPa
th\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.520012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.520083 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.520102 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 
14:56:24.520128 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.520146 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.531629 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.545840 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 
14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.559005 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.570010 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.579522 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.592284 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.611092 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.623827 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.623881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.623895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.623914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.623927 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.627049 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.640104 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14
:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.651713 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.660048 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.660328 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:56:32.660283869 +0000 UTC m=+37.279459080 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.665416 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.727936 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.727999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.728024 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.728058 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.728082 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.760875 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.760930 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.760967 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.761000 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761100 4771 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761152 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761194 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761196 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761223 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761157 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761321 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761342 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761167 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:32.761146489 +0000 UTC m=+37.380321650 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761426 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:32.761401225 +0000 UTC m=+37.380576426 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761457 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:32.761440366 +0000 UTC m=+37.380615577 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.761484 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:32.761468977 +0000 UTC m=+37.380644228 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.831626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.831670 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.831682 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.831703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.831716 4771 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.934131 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.934197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.934215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.934243 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.934262 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:24Z","lastTransitionTime":"2025-10-01T14:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.985354 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.985366 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.985551 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:24 crc kubenswrapper[4771]: I1001 14:56:24.985392 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.985630 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:24 crc kubenswrapper[4771]: E1001 14:56:24.985658 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.036576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.036614 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.036625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.036646 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.036664 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.139957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.140020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.140032 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.140051 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.140064 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.189690 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.189779 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.189794 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.189803 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.189813 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.191430 4771 generic.go:334] "Generic (PLEG): container finished" podID="4be36f4b-1171-4281-a7ac-43e411e080f7" containerID="3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a" exitCode=0 Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.191835 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerDied","Data":"3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.208097 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.230239 4771 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.243448 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.243491 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.243502 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.243520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.243531 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.253675 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.273411 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.289168 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.305287 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.321653 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.338116 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.345984 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc 
kubenswrapper[4771]: I1001 14:56:25.346036 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.346060 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.346084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.346098 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.352781 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.377005 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\
\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.398659 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.435377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.448624 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.448706 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.448743 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.448762 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.448775 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.457767 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016
d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.478710 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.497033 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387
cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.554811 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.554885 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.554897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.554914 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.554925 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.658205 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.658258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.658269 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.658289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.658304 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.760592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.760644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.760658 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.760676 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.760689 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.864341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.864400 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.864415 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.864441 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.864457 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.968463 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.968510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.968523 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.968544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:25 crc kubenswrapper[4771]: I1001 14:56:25.968559 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:25Z","lastTransitionTime":"2025-10-01T14:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.007458 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.029514 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.054593 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.071327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc 
kubenswrapper[4771]: I1001 14:56:26.071365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.071377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.071396 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.071412 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.103653 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.125194 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.144359 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.163574 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.173479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.173545 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.173563 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.173589 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.173608 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.181929 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.206257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerStarted","Data":"e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.207225 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni 
whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termi
nated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bon
d-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\
"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.213224 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} Oct 01 14:56:26 crc 
kubenswrapper[4771]: I1001 14:56:26.231046 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.249456 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.265373 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.277601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.277657 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.277671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.277699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.277718 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.279403 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.293304 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.312940 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.328468 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 
14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.344440 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.363341 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.376146 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.380466 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.380509 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.380522 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.380543 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.380556 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.390830 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.410865 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.427551 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.444177 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.459428 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.484406 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc 
kubenswrapper[4771]: I1001 14:56:26.484442 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.484460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.484477 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.484490 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.485305 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.502529 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.516385 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.531318 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.542902 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.562873 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"rest
artCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.586957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.587020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: 
I1001 14:56:26.587033 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.587054 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.587068 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.689554 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.689608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.689623 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.689642 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.689652 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.792869 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.792918 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.792931 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.792956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.792972 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.897106 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.897173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.897191 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.897217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.897236 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:26Z","lastTransitionTime":"2025-10-01T14:56:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.984527 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.984583 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:26 crc kubenswrapper[4771]: I1001 14:56:26.984704 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:26 crc kubenswrapper[4771]: E1001 14:56:26.984913 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:26 crc kubenswrapper[4771]: E1001 14:56:26.985091 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:26 crc kubenswrapper[4771]: E1001 14:56:26.985338 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.000426 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.000506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.000526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.000559 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.000592 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.104905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.104950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.104966 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.104989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.105005 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.208115 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.208167 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.208184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.208211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.208229 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.219808 4771 generic.go:334] "Generic (PLEG): container finished" podID="4be36f4b-1171-4281-a7ac-43e411e080f7" containerID="e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68" exitCode=0 Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.219871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerDied","Data":"e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.247946 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\
\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.274881 4771 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.293590 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.311421 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.311515 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.311542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.311578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.311601 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.318296 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.340815 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.360724 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.382900 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.409299 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.414111 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.414159 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.414173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 
14:56:27.414192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.414205 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.424025 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.438024 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.451093 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.469663 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.499167 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.514704 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.517291 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.517318 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.517330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.517350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.517362 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.530100 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016
d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:27Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.620077 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.620665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.620681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.620707 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.620722 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.723880 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.723919 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.723928 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.723947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.723957 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.826470 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.826526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.826536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.826557 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.826568 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.929623 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.929689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.929712 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.929777 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:27 crc kubenswrapper[4771]: I1001 14:56:27.929805 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:27Z","lastTransitionTime":"2025-10-01T14:56:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.033033 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.033124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.033147 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.033178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.033200 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.136591 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.136626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.136637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.136657 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.136670 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.227588 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.229988 4771 generic.go:334] "Generic (PLEG): container finished" podID="4be36f4b-1171-4281-a7ac-43e411e080f7" containerID="bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd" exitCode=0 Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.230070 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerDied","Data":"bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.239820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.239873 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.239891 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.239915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.239928 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.250624 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.266228 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.286016 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.297924 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.325492 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 
14:56:28.342874 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.342956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.342972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.343022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.343038 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.350232 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.368771 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.383811 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.402404 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.416017 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387
cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.429505 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.446383 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.446428 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.446438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc 
kubenswrapper[4771]: I1001 14:56:28.446456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.446468 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.454120 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.472818 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.487494 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.506029 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:28Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.549712 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.549775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.549786 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.549805 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.549819 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.653778 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.653863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.653886 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.653924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.653948 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.756904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.756977 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.757001 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.757034 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.757055 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.860653 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.860770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.860799 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.860831 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.860881 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.963692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.963788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.963808 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.963840 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.963859 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:28Z","lastTransitionTime":"2025-10-01T14:56:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.984669 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.984765 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:28 crc kubenswrapper[4771]: E1001 14:56:28.984896 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:28 crc kubenswrapper[4771]: I1001 14:56:28.985840 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:28 crc kubenswrapper[4771]: E1001 14:56:28.985925 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:28 crc kubenswrapper[4771]: E1001 14:56:28.986003 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.067157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.067244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.067268 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.067305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.067330 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.172113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.172194 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.172214 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.172241 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.172263 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.239147 4771 generic.go:334] "Generic (PLEG): container finished" podID="4be36f4b-1171-4281-a7ac-43e411e080f7" containerID="2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96" exitCode=0 Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.239219 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerDied","Data":"2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.256305 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.269908 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.275194 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.275235 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.275244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.275262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.275275 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.289208 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.320639 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"i
mage\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
esources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.341103 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a
2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.357240 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.377591 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.378606 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.378648 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.378663 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.378685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.378701 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.393698 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.408391 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.427116 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.441830 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.456436 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.473443 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.482499 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.482565 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.482583 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.482608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.482621 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.497451 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.531955 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:29Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.586571 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc 
kubenswrapper[4771]: I1001 14:56:29.586623 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.586635 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.586653 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.586664 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.690760 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.690820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.690832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.690854 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.690872 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.795020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.795516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.795526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.795544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.795554 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.899047 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.899099 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.899110 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.899130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:29 crc kubenswrapper[4771]: I1001 14:56:29.899146 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:29Z","lastTransitionTime":"2025-10-01T14:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.002227 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.002479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.002784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.003064 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.003326 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.106985 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.107020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.107031 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.107049 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.107059 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.210524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.210607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.210619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.210639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.210652 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.248546 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.248910 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.251788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerStarted","Data":"bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.265978 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.282530 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.290910 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.307466 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-
01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.313144 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.313195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.313211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.313280 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.313305 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.321208 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.335719 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.350520 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.363575 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.374885 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.384088 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.398416 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.418574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.418619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.418628 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.418646 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.418657 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.434265 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.449097 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.459632 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.481842 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.502400 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387
cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.521659 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.521698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.521711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.521744 4771 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.521758 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.530027 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.546157 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.559553 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.571998 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.598584 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.612201 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.624474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.624519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.624530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.624547 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.624561 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.624779 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.637757 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.656475 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.669410 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.685173 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.698891 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.710757 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.724991 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245
ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.727159 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.727193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.727202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.727221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.727231 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.738350 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7c
af225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:30Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.830118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.830171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.830182 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.830200 4771 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.830211 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.932831 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.932903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.932920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.932947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.932966 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:30Z","lastTransitionTime":"2025-10-01T14:56:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.985196 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:30 crc kubenswrapper[4771]: E1001 14:56:30.985397 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.985963 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:30 crc kubenswrapper[4771]: I1001 14:56:30.986014 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:30 crc kubenswrapper[4771]: E1001 14:56:30.986149 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:30 crc kubenswrapper[4771]: E1001 14:56:30.986284 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.037032 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.037118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.037141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.037172 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.037193 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.141532 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.141654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.141680 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.141720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.141794 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.244832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.244910 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.244930 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.244955 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.244973 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.261181 4771 generic.go:334] "Generic (PLEG): container finished" podID="4be36f4b-1171-4281-a7ac-43e411e080f7" containerID="bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917" exitCode=0 Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.261319 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerDied","Data":"bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.261378 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.261454 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.303902 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.306226 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.330467 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.348680 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.348784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.348807 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.348834 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.348853 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.351720 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016
d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.373320 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.390359 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.412696 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245
ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.427882 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.440858 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.451973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.452025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.452043 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.452066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.452079 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.457061 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.473081 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.485778 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.508250 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.525496 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.542702 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.555506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.555560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.555576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 
14:56:31.555598 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.555612 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.558414 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.573921 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b
67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.587070 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.605259 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.631594 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.647396 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.658241 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.658287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.658301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.658322 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.658570 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.660830 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.679975 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245
ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.706215 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.723793 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.739397 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.755419 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.760536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.760582 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.760595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.760614 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.760627 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.768538 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.781391 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.807803 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.825270 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-0
1T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:31Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.865451 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.865504 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.865517 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.865537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.865549 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.969347 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.969388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.969397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.969412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:31 crc kubenswrapper[4771]: I1001 14:56:31.969422 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:31Z","lastTransitionTime":"2025-10-01T14:56:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.091803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.091835 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.091843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.091858 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.091868 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.195182 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.196516 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.196788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.196983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.197116 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.276762 4771 generic.go:334] "Generic (PLEG): container finished" podID="4be36f4b-1171-4281-a7ac-43e411e080f7" containerID="fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645" exitCode=0 Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.277010 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.277228 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerDied","Data":"fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.300877 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e
01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.302679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.302720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.302902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.302924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.302937 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.326813 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.348707 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.369776 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.392559 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.398236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.398284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.398296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.398318 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.398332 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.408019 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.416856 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.424541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.424593 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.424606 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.424628 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.424643 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.435023 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:
56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.439459 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.443425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.443473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.443488 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.443510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.443525 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.453561 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.457528 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.464027 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.464064 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.464073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.464094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.464109 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.471817 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.476854 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.480769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.480814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.480829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.480847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.480860 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.487342 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016
d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.494102 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.494340 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.496564 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.496614 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.496630 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.496655 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.496752 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.501620 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.513407 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.528559 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.552957 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.568009 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:32Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.599871 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.599915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.599927 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.599960 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.599974 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.708362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.708435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.708459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.708494 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.708512 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.755474 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.755818 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 14:56:48.755766036 +0000 UTC m=+53.374941257 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.811798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.811863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.811879 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.811904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.811936 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.856413 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.856496 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.856523 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.856592 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856719 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856835 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856881 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856890 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:48.856855922 +0000 UTC m=+53.476031133 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856835 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856945 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856895 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856978 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.857039 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:48.857021596 +0000 UTC m=+53.476196817 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.856760 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.857190 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:48.857163329 +0000 UTC m=+53.476338500 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.857601 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 14:56:48.857577299 +0000 UTC m=+53.476752550 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.915023 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.915063 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.915074 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.915092 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.915104 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:32Z","lastTransitionTime":"2025-10-01T14:56:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.985287 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.985295 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.985471 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:32 crc kubenswrapper[4771]: I1001 14:56:32.985295 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.985596 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:32 crc kubenswrapper[4771]: E1001 14:56:32.985770 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.017645 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.017688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.017698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.017713 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.017723 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.121118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.121188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.121205 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.121230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.121252 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.224992 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.225072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.225097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.225146 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.225197 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.288394 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.289454 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" event={"ID":"4be36f4b-1171-4281-a7ac-43e411e080f7","Type":"ContainerStarted","Data":"65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.313981 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.327613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.327875 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.327943 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.328012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.328078 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.336347 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.361341 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.394464 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.417715 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.431073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.431129 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.431140 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.431183 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.431196 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.434772 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016
d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.449967 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.463304 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.481836 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.495095 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.510362 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.524041 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.534488 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.534532 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.534542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.534561 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.534571 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.536358 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.549249 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.571071 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:33Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.637197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.637234 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.637245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.637264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.637277 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.740245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.740312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.740327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.740350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.740365 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.842987 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.843064 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.843088 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.843118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.843151 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.952114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.952161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.952179 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.952201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:33 crc kubenswrapper[4771]: I1001 14:56:33.952217 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:33Z","lastTransitionTime":"2025-10-01T14:56:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.054989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.055052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.055069 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.055094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.055111 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.157789 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.157856 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.157867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.157887 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.157900 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.260222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.260256 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.260266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.260279 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.260289 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.294464 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/0.log" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.297670 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c" exitCode=1 Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.297764 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.299030 4771 scope.go:117] "RemoveContainer" containerID="2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.318072 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.330549 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.343274 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.362183 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.362223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.362260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.362276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.362285 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.362157 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:34Z\\\",\\\"message\\\":\\\"1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.050470 6039 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:56:34.050716 6039 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.051039 6039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:34.051068 6039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:34.051089 6039 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:34.051101 6039 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 14:56:34.051106 6039 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 14:56:34.051163 6039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:34.051220 6039 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:34.051224 6039 factory.go:656] Stopping watch factory\\\\nI1001 14:56:34.051233 6039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:34.051257 6039 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:34.051287 6039 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b
3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.380473 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.393753 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.410096 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.428377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.440589 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.456628 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.464933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.464980 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.464993 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.465012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.465025 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.469010 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.486847 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc
2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.517423 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.536821 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.550428 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:34Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.567467 4771 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.567509 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.567523 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.567541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.567554 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.670393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.670472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.670495 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.670526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.670550 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.776998 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.777076 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.777094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.777119 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.777138 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.879877 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.879913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.879923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.879939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.879952 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.982534 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.982576 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.982588 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.982604 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.982617 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:34Z","lastTransitionTime":"2025-10-01T14:56:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.984995 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.985030 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:34 crc kubenswrapper[4771]: E1001 14:56:34.985102 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:34 crc kubenswrapper[4771]: I1001 14:56:34.985139 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:34 crc kubenswrapper[4771]: E1001 14:56:34.985229 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:34 crc kubenswrapper[4771]: E1001 14:56:34.985328 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.084989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.085032 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.085045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.085062 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.085074 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.188085 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.188138 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.188148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.188163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.188174 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.291062 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.291113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.291125 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.291141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.291154 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.305627 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/0.log" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.308943 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.309099 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.325119 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.342052 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.354079 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.371901 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.393619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.393675 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.393692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.393716 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.393786 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.402261 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.426030 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.445402 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.468656 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.486707 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.496778 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.496826 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.496837 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.496852 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.496864 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.504401 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.528094 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:34Z\\\",\\\"message\\\":\\\"1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.050470 6039 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:56:34.050716 6039 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.051039 6039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:34.051068 6039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:34.051089 6039 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:34.051101 6039 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 14:56:34.051106 6039 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 14:56:34.051163 6039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:34.051220 6039 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:34.051224 6039 factory.go:656] Stopping watch factory\\\\nI1001 14:56:34.051233 6039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:34.051257 6039 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:34.051287 6039 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.546045 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.562631 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.583401 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.599631 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.599679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.599692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.599714 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.599750 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.604018 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.703066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.703139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.703178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.703211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.703237 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.737710 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x"] Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.738606 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.740584 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.740974 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.776668 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.789206 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2dtb\" (UniqueName: \"kubernetes.io/projected/fe483b7b-ed55-4649-ac50-66ac981305e2-kube-api-access-x2dtb\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.789286 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe483b7b-ed55-4649-ac50-66ac981305e2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.789324 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe483b7b-ed55-4649-ac50-66ac981305e2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.789350 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe483b7b-ed55-4649-ac50-66ac981305e2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.799047 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.805538 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.805574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 
01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.805583 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.805599 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.805609 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.831600 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.844062 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.854554 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.869404 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.889053 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a
3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.889953 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe483b7b-ed55-4649-ac50-66ac981305e2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.890027 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2dtb\" (UniqueName: \"kubernetes.io/projected/fe483b7b-ed55-4649-ac50-66ac981305e2-kube-api-access-x2dtb\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: 
I1001 14:56:35.890075 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe483b7b-ed55-4649-ac50-66ac981305e2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.890114 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe483b7b-ed55-4649-ac50-66ac981305e2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.890814 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe483b7b-ed55-4649-ac50-66ac981305e2-env-overrides\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.891087 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe483b7b-ed55-4649-ac50-66ac981305e2-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.899257 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe483b7b-ed55-4649-ac50-66ac981305e2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: 
\"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.904251 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.909129 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.909168 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.909182 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.909201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.909215 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:35Z","lastTransitionTime":"2025-10-01T14:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.909956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2dtb\" (UniqueName: \"kubernetes.io/projected/fe483b7b-ed55-4649-ac50-66ac981305e2-kube-api-access-x2dtb\") pod \"ovnkube-control-plane-749d76644c-9jb9x\" (UID: \"fe483b7b-ed55-4649-ac50-66ac981305e2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.919917 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cl
uster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.935718 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.950505 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.964485 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
5-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:35 crc kubenswrapper[4771]: I1001 14:56:35.981948 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.004933 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:34Z\\\",\\\"message\\\":\\\"1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.050470 6039 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:56:34.050716 6039 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.051039 6039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:34.051068 6039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:34.051089 6039 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:34.051101 6039 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 14:56:34.051106 6039 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 14:56:34.051163 6039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:34.051220 6039 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:34.051224 6039 factory.go:656] Stopping watch factory\\\\nI1001 14:56:34.051233 6039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:34.051257 6039 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:34.051287 6039 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.012655 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.012762 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.012786 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.012815 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.012833 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.020435 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.036638 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.054571 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.072112 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.072142 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.084348 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 
14:56:36.108495 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:34Z\\\",\\\"message\\\":\\\"1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.050470 6039 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:56:34.050716 6039 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.051039 6039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:34.051068 6039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:34.051089 6039 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:34.051101 6039 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 14:56:34.051106 6039 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 14:56:34.051163 6039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:34.051220 6039 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:34.051224 6039 factory.go:656] Stopping watch factory\\\\nI1001 14:56:34.051233 6039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:34.051257 6039 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:34.051287 6039 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.117150 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.117196 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.117206 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.117223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.117234 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.128696 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.145958 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.191502 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.208000 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.220239 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.220315 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.220334 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.220361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.220381 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.226466 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.241395 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.254913 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.272605 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0e
e00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff
3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":
\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.293572 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.311967 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.315462 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" event={"ID":"fe483b7b-ed55-4649-ac50-66ac981305e2","Type":"ContainerStarted","Data":"66be2b058a867417dd127080e41de2dbda3404e4d59a01ef2773220b7a02d4c4"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.317363 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/1.log" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.318295 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/0.log" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.321818 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae" exitCode=1 Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.321883 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae"} Oct 01 14:56:36 crc kubenswrapper[4771]: 
I1001 14:56:36.321942 4771 scope.go:117] "RemoveContainer" containerID="2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.323156 4771 scope.go:117] "RemoveContainer" containerID="f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.323332 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.323359 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.323367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.323380 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.323389 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: E1001 14:56:36.323411 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.330180 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.343988 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.362284 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387
cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.386083 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.407256 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.426112 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.426404 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.426543 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.426685 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.426930 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.428364 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.443993 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.462280 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.496917 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:34Z\\\",\\\"message\\\":\\\"1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.050470 6039 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:56:34.050716 6039 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.051039 6039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:34.051068 6039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:34.051089 6039 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:34.051101 6039 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 14:56:34.051106 6039 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 14:56:34.051163 6039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:34.051220 6039 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:34.051224 6039 factory.go:656] Stopping watch factory\\\\nI1001 14:56:34.051233 6039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:34.051257 6039 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:34.051287 6039 metrics.go:553] Stopping metrics server at address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could 
not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\
\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.515279 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.530820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.530892 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.530915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.530945 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.530969 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.532514 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.550788 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.567825 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.584216 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.600324 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.612143 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.632523 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.633573 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.633628 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.633645 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.633665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.633680 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.657353 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:36Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.736595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.736658 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.736671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.736687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.736698 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.839677 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.839803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.839831 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.839865 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.839893 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.942223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.942289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.942305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.942330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.942350 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:36Z","lastTransitionTime":"2025-10-01T14:56:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.984494 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.984566 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:36 crc kubenswrapper[4771]: I1001 14:56:36.984585 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:36 crc kubenswrapper[4771]: E1001 14:56:36.984698 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:36 crc kubenswrapper[4771]: E1001 14:56:36.984819 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:36 crc kubenswrapper[4771]: E1001 14:56:36.984968 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.045459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.045493 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.045501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.045515 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.045525 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.148182 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.148671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.148702 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.148770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.148797 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.251498 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.251548 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.251561 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.251579 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.251591 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.294480 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-8qdkc"] Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.295025 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:37 crc kubenswrapper[4771]: E1001 14:56:37.295095 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.305399 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.305514 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9mq6\" (UniqueName: \"kubernetes.io/projected/a49c960d-cfd1-4745-976b-59c62e3dcf8e-kube-api-access-m9mq6\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.314127 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.328908 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/1.log" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.334668 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" event={"ID":"fe483b7b-ed55-4649-ac50-66ac981305e2","Type":"ContainerStarted","Data":"fb235560f4fabd7d33a9286e029c075fefa4dd44eea942cd8fe4ca74819ce722"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.334862 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" event={"ID":"fe483b7b-ed55-4649-ac50-66ac981305e2","Type":"ContainerStarted","Data":"e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca"} Oct 01 
14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.337614 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.353864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.354111 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.354293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.354513 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.354712 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.376048 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:34Z\\\",\\\"message\\\":\\\"1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.050470 6039 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:56:34.050716 6039 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.051039 6039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:34.051068 6039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:34.051089 6039 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:34.051101 6039 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 14:56:34.051106 6039 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 14:56:34.051163 6039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:34.051220 6039 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:34.051224 6039 factory.go:656] Stopping watch factory\\\\nI1001 14:56:34.051233 6039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:34.051257 6039 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:34.051287 6039 metrics.go:553] Stopping metrics server at address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could 
not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\
\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.401055 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.406487 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9mq6\" (UniqueName: \"kubernetes.io/projected/a49c960d-cfd1-4745-976b-59c62e3dcf8e-kube-api-access-m9mq6\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.406609 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:37 crc kubenswrapper[4771]: E1001 14:56:37.406844 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:37 crc kubenswrapper[4771]: E1001 14:56:37.406939 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs podName:a49c960d-cfd1-4745-976b-59c62e3dcf8e nodeName:}" failed. No retries permitted until 2025-10-01 14:56:37.906909465 +0000 UTC m=+42.526084676 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs") pod "network-metrics-daemon-8qdkc" (UID: "a49c960d-cfd1-4745-976b-59c62e3dcf8e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.423870 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.437358 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9mq6\" (UniqueName: \"kubernetes.io/projected/a49c960d-cfd1-4745-976b-59c62e3dcf8e-kube-api-access-m9mq6\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.443608 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.458144 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.458201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.458236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.458256 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.458268 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.463506 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.483313 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.498750 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.512291 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.524460 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc 
kubenswrapper[4771]: I1001 14:56:37.539900 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16
bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.559953 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.561204 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.561246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.561258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.561274 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc 
kubenswrapper[4771]: I1001 14:56:37.561288 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.579118 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.600200 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.618051 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.636224 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.661878 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.663788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.663830 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.663840 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.663855 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.663866 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.681262 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.697005 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.712890 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.724349 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.743068 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.758919 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.766443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.766527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.766544 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.766570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.766587 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.773718 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe
2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4dd44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.789831 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.807003 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.817589 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.828396 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.848615 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:34Z\\\",\\\"message\\\":\\\"1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.050470 6039 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:56:34.050716 6039 reflector.go:311] Stopping reflector 
*v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.051039 6039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:34.051068 6039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:34.051089 6039 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:34.051101 6039 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 14:56:34.051106 6039 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 14:56:34.051163 6039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:34.051220 6039 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:34.051224 6039 factory.go:656] Stopping watch factory\\\\nI1001 14:56:34.051233 6039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:34.051257 6039 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:34.051287 6039 metrics.go:553] Stopping metrics server at address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could 
not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\
\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.862798 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.869285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.869320 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.869331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.869345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.869353 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.876971 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.897391 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.911058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:37 crc kubenswrapper[4771]: E1001 14:56:37.911331 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:37 crc kubenswrapper[4771]: E1001 14:56:37.911487 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs podName:a49c960d-cfd1-4745-976b-59c62e3dcf8e nodeName:}" failed. No retries permitted until 2025-10-01 14:56:38.911452132 +0000 UTC m=+43.530627303 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs") pod "network-metrics-daemon-8qdkc" (UID: "a49c960d-cfd1-4745-976b-59c62e3dcf8e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.912835 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:37Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:37 crc 
kubenswrapper[4771]: I1001 14:56:37.977130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.977237 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.977418 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.977471 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.977491 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:37Z","lastTransitionTime":"2025-10-01T14:56:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:37 crc kubenswrapper[4771]: I1001 14:56:37.985356 4771 scope.go:117] "RemoveContainer" containerID="c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.084307 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.084361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.084374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.084393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.084406 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.188086 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.188127 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.188135 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.188150 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.188159 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.290760 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.290800 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.290811 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.290828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.290840 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.339039 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.341291 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.341754 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.367607 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2720b235ad3d306800d8680e8ece6483b40235bf69b19ba91a8aeb99cf24cd4c\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:34Z\\\",\\\"message\\\":\\\"1alpha1.BaselineAdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.050470 6039 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:56:34.050716 6039 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1001 14:56:34.051039 6039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:34.051068 6039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:34.051089 6039 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:34.051101 6039 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1001 14:56:34.051106 6039 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1001 14:56:34.051163 6039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:34.051220 6039 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:34.051224 6039 factory.go:656] Stopping watch factory\\\\nI1001 14:56:34.051233 6039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:34.051257 6039 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:34.051287 6039 metrics.go:553] Stopping metrics server at 
address\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 
14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b
3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.387364 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.393937 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.393975 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.393989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.394010 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.394026 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.403939 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.421986 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.444444 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.459986 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.481443 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.497163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.497195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.497206 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 
14:56:38.497222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.497234 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.503612 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.521270 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc 
kubenswrapper[4771]: I1001 14:56:38.545186 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16
bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.576026 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"rest
artCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/
log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"container
ID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.600118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.600152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.600160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.600171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc 
kubenswrapper[4771]: I1001 14:56:38.600181 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.601038 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.617387 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.634246 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.648106 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.668016 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.687629 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:38Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.702460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.702520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.702531 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.702546 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.702558 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.805035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.805067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.805076 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.805088 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.805097 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.907784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.907816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.907824 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.907837 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.907846 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:38Z","lastTransitionTime":"2025-10-01T14:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.922117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:38 crc kubenswrapper[4771]: E1001 14:56:38.922271 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:38 crc kubenswrapper[4771]: E1001 14:56:38.922319 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs podName:a49c960d-cfd1-4745-976b-59c62e3dcf8e nodeName:}" failed. No retries permitted until 2025-10-01 14:56:40.922304491 +0000 UTC m=+45.541479672 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs") pod "network-metrics-daemon-8qdkc" (UID: "a49c960d-cfd1-4745-976b-59c62e3dcf8e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.985032 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:38 crc kubenswrapper[4771]: E1001 14:56:38.985200 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.985586 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:38 crc kubenswrapper[4771]: E1001 14:56:38.985657 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.985770 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:38 crc kubenswrapper[4771]: E1001 14:56:38.985841 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:38 crc kubenswrapper[4771]: I1001 14:56:38.985891 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:38 crc kubenswrapper[4771]: E1001 14:56:38.985950 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.010236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.010309 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.010333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.010361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.010388 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.113756 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.113797 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.113807 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.113822 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.113833 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.216518 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.216582 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.216602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.216627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.216648 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.319933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.319983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.319993 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.320010 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.320022 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.424086 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.424141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.424153 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.424171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.424183 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.527490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.527550 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.527568 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.527592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.527611 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.630189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.630233 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.630246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.630265 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.630279 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.732840 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.732903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.732918 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.732940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.732959 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.835997 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.836081 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.836100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.836126 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.836144 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.940037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.940124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.940148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.940178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:39 crc kubenswrapper[4771]: I1001 14:56:39.940200 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:39Z","lastTransitionTime":"2025-10-01T14:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.044399 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.044467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.044493 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.044525 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.044549 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.149298 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.149377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.149403 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.149437 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.149465 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.253369 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.253449 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.253484 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.253514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.253536 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.356347 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.356408 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.356425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.356452 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.356472 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.459217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.459708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.459728 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.459776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.459794 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.564148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.564227 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.564249 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.564275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.564296 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.667671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.667769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.667788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.667814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.667833 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.770939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.770999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.771016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.771046 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.771066 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.875035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.875123 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.875156 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.875189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.875215 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.940120 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:40 crc kubenswrapper[4771]: E1001 14:56:40.940397 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:40 crc kubenswrapper[4771]: E1001 14:56:40.940502 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs podName:a49c960d-cfd1-4745-976b-59c62e3dcf8e nodeName:}" failed. No retries permitted until 2025-10-01 14:56:44.940473513 +0000 UTC m=+49.559648724 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs") pod "network-metrics-daemon-8qdkc" (UID: "a49c960d-cfd1-4745-976b-59c62e3dcf8e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.978804 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.978887 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.978908 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.978934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.978954 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:40Z","lastTransitionTime":"2025-10-01T14:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.984415 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.984449 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.984576 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:40 crc kubenswrapper[4771]: I1001 14:56:40.985000 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:40 crc kubenswrapper[4771]: E1001 14:56:40.984991 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:40 crc kubenswrapper[4771]: E1001 14:56:40.985156 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:40 crc kubenswrapper[4771]: E1001 14:56:40.985270 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:40 crc kubenswrapper[4771]: E1001 14:56:40.985399 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.082348 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.082865 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.083094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.083287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.083502 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.188360 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.188429 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.188447 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.188478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.188498 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.292210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.292264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.292282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.292305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.292323 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.396530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.396594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.396613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.396638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.396660 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.501216 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.501285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.501303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.501330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.501352 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.604319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.604382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.604398 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.604421 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.604439 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.707715 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.707782 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.707793 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.707812 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.707824 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.811568 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.811626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.811644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.811667 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.811685 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.915474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.915884 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.916134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.916357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:41 crc kubenswrapper[4771]: I1001 14:56:41.916501 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:41Z","lastTransitionTime":"2025-10-01T14:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.020646 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.020700 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.020712 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.020747 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.020760 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.123984 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.124055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.124067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.124090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.124104 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.228286 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.228370 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.228389 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.228426 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.228448 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.332438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.332506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.332524 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.332549 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.332568 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.435992 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.436052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.436068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.436090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.436107 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.538932 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.539002 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.539016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.539044 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.539061 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.643063 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.643118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.643130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.643148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.643160 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.745530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.745566 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.745574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.745589 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.745600 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.807666 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.807717 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.807766 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.807787 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.807803 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.826575 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:42Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.831317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.831357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.831366 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.831384 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.831399 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.849003 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:42Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.854018 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.854059 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.854070 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.854087 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.854101 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.870796 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:42Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.875870 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.875931 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.875960 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.875987 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.876007 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.890033 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:42Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.894319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.894596 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.894765 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.894885 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.894974 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.915008 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:42Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.915239 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.917639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.917720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.917790 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.917832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.917860 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:42Z","lastTransitionTime":"2025-10-01T14:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.984342 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.984432 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.984437 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:42 crc kubenswrapper[4771]: I1001 14:56:42.984363 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.984552 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.984669 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.984860 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:42 crc kubenswrapper[4771]: E1001 14:56:42.985028 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.020810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.020868 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.020878 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.020900 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.020913 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.123936 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.124020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.124038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.124062 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.124080 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.228204 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.228275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.228296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.228323 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.228342 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.255921 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.257287 4771 scope.go:117] "RemoveContainer" containerID="f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae" Oct 01 14:56:43 crc kubenswrapper[4771]: E1001 14:56:43.257779 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.283170 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.305921 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.323495 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.331231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc 
kubenswrapper[4771]: I1001 14:56:43.331255 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.331264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.331282 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.331295 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.338818 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.368135 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c
9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.383633 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.401587 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.418031 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.428756 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.434666 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.434817 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.434899 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.434981 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.435057 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.445926 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-rele
ase\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.462535 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\
\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.475051 4771 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4dd44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.498437 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.517294 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.534510 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.537980 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.538185 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.538316 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.538444 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.538578 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.564060 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.593967 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 
14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:43Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.642129 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.642168 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.642177 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.642196 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.642207 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.744464 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.744526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.744550 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.744578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.744602 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.848726 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.848844 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.848878 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.848924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.848949 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.952939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.953005 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.953025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.953044 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:43 crc kubenswrapper[4771]: I1001 14:56:43.953056 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:43Z","lastTransitionTime":"2025-10-01T14:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.055155 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.055483 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.055557 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.055634 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.055706 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.158519 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.158863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.158957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.159072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.159165 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.263308 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.263803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.263999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.264229 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.264419 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.322289 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.332249 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.345597 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.360883 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.367224 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.367283 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.367301 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.367322 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.367338 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.377021 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0
aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\
\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.401778 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 
14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.416110 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.439314 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.455344 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.469881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.469933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.469944 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.469964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.469977 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.476940 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.496941 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.522840 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.538330 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc 
kubenswrapper[4771]: I1001 14:56:44.555871 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
qx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.573772 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.574487 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.574532 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.574560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.574579 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.576878 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.608625 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.628502 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 
14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.645198 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.661851 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:44Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.677180 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.677220 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.677231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.677247 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.677258 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.780675 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.780797 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.780857 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.780889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.780912 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.884106 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.884163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.884171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.884187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.884195 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.984341 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.984463 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:44 crc kubenswrapper[4771]: E1001 14:56:44.985114 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.984527 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:44 crc kubenswrapper[4771]: E1001 14:56:44.985124 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.984502 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:44 crc kubenswrapper[4771]: E1001 14:56:44.985509 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:44 crc kubenswrapper[4771]: E1001 14:56:44.985934 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.987414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.987448 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.987459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.987477 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.987489 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:44Z","lastTransitionTime":"2025-10-01T14:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:44 crc kubenswrapper[4771]: I1001 14:56:44.992723 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:44 crc kubenswrapper[4771]: E1001 14:56:44.993030 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:44 crc kubenswrapper[4771]: E1001 14:56:44.993104 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs podName:a49c960d-cfd1-4745-976b-59c62e3dcf8e nodeName:}" failed. No retries permitted until 2025-10-01 14:56:52.99308252 +0000 UTC m=+57.612257731 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs") pod "network-metrics-daemon-8qdkc" (UID: "a49c960d-cfd1-4745-976b-59c62e3dcf8e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.090702 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.090803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.090820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.090844 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.090863 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.193705 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.193785 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.193804 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.193827 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.193846 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.297159 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.297201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.297212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.297228 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.297239 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.400506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.400572 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.400590 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.400613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.400631 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.503496 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.503597 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.503613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.503630 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.503644 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.672388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.672430 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.672445 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.672491 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.672504 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.775763 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.775828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.775845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.775868 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.775888 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.879072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.879161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.879209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.879232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.879248 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.981699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.981772 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.981807 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.981823 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:45 crc kubenswrapper[4771]: I1001 14:56:45.981834 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:45Z","lastTransitionTime":"2025-10-01T14:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.005529 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.022532 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.039538 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.064400 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 
14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.080189 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.084141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.084211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.084236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.084263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.084284 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.102265 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.120947 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.134906 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.155372 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.170791 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.186510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.186544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.186552 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 
14:56:46.186566 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.186577 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.189227 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.201744 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc 
kubenswrapper[4771]: I1001 14:56:46.212775 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q
qx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.228216 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\
"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.261553 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.283090 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.289247 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.289304 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.289324 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.289351 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.289375 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.298076 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016
d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.312822 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:46Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.392429 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.392490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.392510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.392533 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.392550 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.496039 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.496086 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.496098 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.496116 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.496139 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.599275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.599328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.599338 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.599355 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.599366 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.702225 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.702279 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.702288 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.702303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.702313 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.806097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.806207 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.806232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.806264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.806289 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.909398 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.909440 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.909455 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.909475 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.909490 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:46Z","lastTransitionTime":"2025-10-01T14:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.984282 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.984350 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.984290 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:46 crc kubenswrapper[4771]: I1001 14:56:46.984289 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:46 crc kubenswrapper[4771]: E1001 14:56:46.984512 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:46 crc kubenswrapper[4771]: E1001 14:56:46.984652 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:46 crc kubenswrapper[4771]: E1001 14:56:46.984813 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:46 crc kubenswrapper[4771]: E1001 14:56:46.984953 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.012124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.012192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.012211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.012234 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.012253 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.115212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.115296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.115320 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.115350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.115376 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.218045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.218118 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.218155 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.218191 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.218213 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.321437 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.321493 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.321513 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.321537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.321558 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.425993 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.426109 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.426124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.426160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.426174 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.529558 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.529809 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.529818 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.529837 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.529848 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.633572 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.633631 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.633651 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.633677 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.633696 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.742101 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.742142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.742151 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.742178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.742188 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.846891 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.847263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.847429 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.847590 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.847802 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.950969 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.951026 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.951041 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.951061 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:47 crc kubenswrapper[4771]: I1001 14:56:47.951075 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:47Z","lastTransitionTime":"2025-10-01T14:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.054775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.054858 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.054887 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.054915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.054942 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.158544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.159061 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.159263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.159447 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.159692 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.262588 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.262640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.262654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.262671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.262685 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.365068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.365178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.365204 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.365236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.365260 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.468947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.469025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.469037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.469074 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.469086 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.572601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.572697 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.572776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.572816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.572840 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.676266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.676313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.676326 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.676345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.676358 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.780156 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.780225 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.780244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.780269 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.780288 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.807711 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.808038 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 14:57:20.808007689 +0000 UTC m=+85.427182900 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.883251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.883388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.883454 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.883490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.883514 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.909111 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.909154 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.909180 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.909206 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909314 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909362 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:57:20.909347632 +0000 UTC m=+85.528522813 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909352 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909447 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909489 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909503 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909517 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909541 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909562 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909518 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:57:20.909481005 +0000 UTC m=+85.528656216 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909679 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 14:57:20.909631998 +0000 UTC m=+85.528807179 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.909703 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 14:57:20.90969392 +0000 UTC m=+85.528869101 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.984987 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.984997 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.985042 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.985233 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.986154 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.986277 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.986399 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:48 crc kubenswrapper[4771]: E1001 14:56:48.986487 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.987550 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.987625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.987650 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.987681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:48 crc kubenswrapper[4771]: I1001 14:56:48.987705 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:48Z","lastTransitionTime":"2025-10-01T14:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.091045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.091117 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.091141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.091170 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.091199 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.194655 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.194763 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.194789 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.194820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.194845 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.298571 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.298634 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.298650 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.298678 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.298695 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.402122 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.402194 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.402221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.402251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.402272 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.505985 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.506044 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.506061 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.506082 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.506100 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.609971 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.610050 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.610072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.610100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.610119 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.713287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.713671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.713843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.713989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.714124 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.817569 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.818462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.818638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.818822 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.818948 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.923299 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.923708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.923967 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.924199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:49 crc kubenswrapper[4771]: I1001 14:56:49.924407 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:49Z","lastTransitionTime":"2025-10-01T14:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.028552 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.029202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.029366 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.029923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.030208 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.134127 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.134521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.134786 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.135096 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.135397 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.247065 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.247128 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.247139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.247160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.247172 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.351071 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.351114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.351125 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.351142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.351153 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.454408 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.454511 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.454526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.454560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.454578 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.557898 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.557966 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.557983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.558009 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.558027 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.577440 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.601772 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.625083 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.646010 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.660902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.660949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.660961 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.660978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.660992 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.664206 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.680594 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.706056 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 
14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.726135 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.746801 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.764199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.764261 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.764278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.764302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.764322 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.770036 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.789313 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc 
kubenswrapper[4771]: I1001 14:56:50.816227 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.842043 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.864868 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.867179 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.867235 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.867252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.867276 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.867294 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.885708 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.904336 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.931674 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.954111 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.970379 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.970652 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.970931 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.971168 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.971415 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:50Z","lastTransitionTime":"2025-10-01T14:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.973086 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe
2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4dd44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:50Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.985307 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.985335 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.985437 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:50 crc kubenswrapper[4771]: E1001 14:56:50.985584 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:50 crc kubenswrapper[4771]: I1001 14:56:50.985682 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:50 crc kubenswrapper[4771]: E1001 14:56:50.985838 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:50 crc kubenswrapper[4771]: E1001 14:56:50.985996 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:50 crc kubenswrapper[4771]: E1001 14:56:50.986214 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.074414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.074458 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.074467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.074482 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.074493 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.178564 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.178626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.178642 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.178666 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.178679 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.281967 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.282035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.282057 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.282088 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.282114 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.384867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.384912 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.384923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.384939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.384950 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.487622 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.487972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.488045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.488140 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.488226 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.590851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.590921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.590940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.590968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.590988 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.694602 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.694687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.694711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.694784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.694804 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.803066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.804467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.804501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.804536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.804559 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.906784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.906845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.906865 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.906889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:51 crc kubenswrapper[4771]: I1001 14:56:51.906906 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:51Z","lastTransitionTime":"2025-10-01T14:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.010045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.010370 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.010455 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.010562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.010668 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.114152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.114232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.114258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.114293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.114318 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.217317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.217679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.218061 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.218267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.218469 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.321692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.322362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.322596 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.322881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.323114 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.426847 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.426921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.426943 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.426970 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.426987 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.530219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.530302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.530327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.530362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.530388 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.633565 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.633871 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.633973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.634081 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.634165 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.737902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.738350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.738530 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.738664 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.738819 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.841112 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.841414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.841510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.841601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.841687 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.944035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.944096 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.944115 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.944139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.944156 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:52Z","lastTransitionTime":"2025-10-01T14:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.984990 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.984997 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.985125 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:52 crc kubenswrapper[4771]: I1001 14:56:52.985163 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:52 crc kubenswrapper[4771]: E1001 14:56:52.985351 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:52 crc kubenswrapper[4771]: E1001 14:56:52.985509 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:52 crc kubenswrapper[4771]: E1001 14:56:52.985652 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:52 crc kubenswrapper[4771]: E1001 14:56:52.985783 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.047891 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.047974 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.047991 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.048015 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.048032 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.060199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:53 crc kubenswrapper[4771]: E1001 14:56:53.060493 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:53 crc kubenswrapper[4771]: E1001 14:56:53.060613 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs podName:a49c960d-cfd1-4745-976b-59c62e3dcf8e nodeName:}" failed. No retries permitted until 2025-10-01 14:57:09.060579666 +0000 UTC m=+73.679754877 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs") pod "network-metrics-daemon-8qdkc" (UID: "a49c960d-cfd1-4745-976b-59c62e3dcf8e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.151130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.151219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.151246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.151278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.151303 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.194110 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.194172 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.194189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.194215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.194230 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: E1001 14:56:53.208676 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:53Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.213288 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.213358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.213377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.213404 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.213431 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: E1001 14:56:53.231982 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:53Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.235824 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.235855 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.235867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.235883 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.235896 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: E1001 14:56:53.253802 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:53Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.258262 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.258303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.258321 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.258338 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.258350 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: E1001 14:56:53.279647 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:53Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.285090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.285139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.285156 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.285179 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.285198 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: E1001 14:56:53.300194 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:53Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:53 crc kubenswrapper[4771]: E1001 14:56:53.300450 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.302851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.302999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.303117 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.303240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.303369 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.406581 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.406939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.407032 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.407145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.407251 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.512092 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.512148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.512169 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.512189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.512207 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.616155 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.616228 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.616247 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.616279 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.616304 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.719436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.719491 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.719502 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.719520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.719532 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.822699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.822760 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.822771 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.822787 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.822799 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.926105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.926169 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.926188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.926210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:53 crc kubenswrapper[4771]: I1001 14:56:53.926228 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:53Z","lastTransitionTime":"2025-10-01T14:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.029281 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.029349 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.029362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.029381 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.029395 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.131878 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.131920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.131932 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.131948 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.131961 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.235405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.235464 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.235476 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.235496 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.235513 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.339101 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.339177 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.339197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.339220 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.339237 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.442318 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.442376 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.442398 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.442424 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.442446 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.545761 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.545810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.545825 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.545844 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.545860 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.648714 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.648825 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.648867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.648897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.648919 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.751890 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.752218 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.752294 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.752372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.752439 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.855845 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.855942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.855961 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.855987 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.856005 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.959011 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.959337 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.959425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.959506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.959579 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:54Z","lastTransitionTime":"2025-10-01T14:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.984783 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.984841 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.984893 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:54 crc kubenswrapper[4771]: E1001 14:56:54.985004 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:54 crc kubenswrapper[4771]: E1001 14:56:54.985140 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:54 crc kubenswrapper[4771]: E1001 14:56:54.985263 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:54 crc kubenswrapper[4771]: I1001 14:56:54.985400 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:54 crc kubenswrapper[4771]: E1001 14:56:54.985661 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.062314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.062938 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.062989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.063027 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.063053 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.165465 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.165527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.165550 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.165575 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.165594 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.269245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.269342 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.269358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.269374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.269385 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.372341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.372481 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.372496 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.372515 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.372527 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.475251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.475305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.475322 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.475343 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.475359 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.578557 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.578612 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.578628 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.578654 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.578672 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.682296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.682350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.682369 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.682391 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.682410 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.785033 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.785113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.785126 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.785148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.785161 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.888397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.888459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.888476 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.888502 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.888520 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.991416 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.991827 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.991981 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.992136 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:55 crc kubenswrapper[4771]: I1001 14:56:55.992274 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:55Z","lastTransitionTime":"2025-10-01T14:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.004009 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.020557 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.041908 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.054800 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc 
kubenswrapper[4771]: I1001 14:56:56.085882 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.096375 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.096421 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.096439 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.096501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.096523 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.109717 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.130080 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.146390 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.164124 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.183489 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.199400 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.199556 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.199575 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.199653 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.199675 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.200772 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.213848 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.225773 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.238431 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.255973 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.266167 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.277471 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.294257 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 
14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:56Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.301609 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.301645 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.301655 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.301667 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.301676 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.403546 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.403593 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.403609 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.403636 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.403652 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.506758 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.506829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.506846 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.506869 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.506887 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.609832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.610314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.610331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.610355 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.610373 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.714142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.714215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.714240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.714272 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.714301 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.817375 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.817412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.817421 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.817434 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.817462 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.921306 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.921515 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.921550 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.921580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.921603 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:56Z","lastTransitionTime":"2025-10-01T14:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.984319 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.984417 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:56 crc kubenswrapper[4771]: E1001 14:56:56.984552 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.984598 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:56 crc kubenswrapper[4771]: I1001 14:56:56.984579 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:56 crc kubenswrapper[4771]: E1001 14:56:56.984816 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:56 crc kubenswrapper[4771]: E1001 14:56:56.984981 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:56 crc kubenswrapper[4771]: E1001 14:56:56.985131 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.026443 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.026527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.026553 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.026585 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.026641 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.129878 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.129947 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.129966 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.129990 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.130008 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.233265 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.233343 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.233361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.233386 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.233403 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.337084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.337151 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.337172 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.337194 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.337213 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.439756 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.439808 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.439824 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.439849 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.439866 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.542527 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.542671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.542696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.542723 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.542760 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.645267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.645365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.645383 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.645407 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.645430 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.748562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.748637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.748662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.748710 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.748763 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.851745 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.851781 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.851793 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.851809 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.851823 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.955150 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.955219 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.955238 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.955267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.955285 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:57Z","lastTransitionTime":"2025-10-01T14:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:57 crc kubenswrapper[4771]: I1001 14:56:57.985511 4771 scope.go:117] "RemoveContainer" containerID="f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.059143 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.059200 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.059218 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.059244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.059262 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.162264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.162302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.162314 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.162336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.162347 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.264993 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.265026 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.265035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.265047 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.265058 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.368997 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.369452 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.369473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.369504 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.369526 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.414107 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/1.log" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.424143 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.425226 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.454068 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0
-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019b
ee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7
c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.473035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.473083 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.473097 4771 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.473114 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.473127 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.474310 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 
14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.489697 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,
\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.508946 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.528132 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.549237 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.562979 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.575544 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.576138 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.576163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.576177 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.576193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.576205 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.585431 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.600817 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.613123 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.625611 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.637583 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.655451 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 
14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 
14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.669958 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.678867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.678904 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.678914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.678927 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.678935 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.684854 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.697478 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.713019 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:58Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:58 crc 
kubenswrapper[4771]: I1001 14:56:58.782479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.782539 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.782556 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.782580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.782598 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.886099 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.886163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.886178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.886202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.886219 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.984969 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.985030 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.985063 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:56:58 crc kubenswrapper[4771]: E1001 14:56:58.985143 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.985172 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:56:58 crc kubenswrapper[4771]: E1001 14:56:58.985380 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:56:58 crc kubenswrapper[4771]: E1001 14:56:58.985458 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:56:58 crc kubenswrapper[4771]: E1001 14:56:58.985654 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.990954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.991012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.991034 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.991063 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:58 crc kubenswrapper[4771]: I1001 14:56:58.991084 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:58Z","lastTransitionTime":"2025-10-01T14:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.094978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.095025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.095063 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.095091 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.095112 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.199244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.199323 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.199335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.199356 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.199367 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.303435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.303504 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.303521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.303552 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.303571 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.406942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.407012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.407025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.407046 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.407058 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.430549 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/2.log" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.431345 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/1.log" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.435864 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33" exitCode=1 Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.435927 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.435979 4771 scope.go:117] "RemoveContainer" containerID="f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.438311 4771 scope.go:117] "RemoveContainer" containerID="76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33" Oct 01 14:56:59 crc kubenswrapper[4771]: E1001 14:56:59.438651 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.466575 4771 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10
-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c5
4a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.492153 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\
"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"et
cd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.509974 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.510019 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.510030 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.510046 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.510058 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.514412 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7
7ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.531252 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.545385 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.555503 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.568208 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.583153 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.612622 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.612658 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.612671 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.612689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.612701 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.613340 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8e753527ef673cd09f9a6d72e7729db9d0d48329088fa9abab4ff1ed30ef3ae\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"message\\\":\\\"false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.110],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nF1001 
14:56:35.269926 6242 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:35Z is after 2025-08-24T17:21:41Z]\\\\nI1001 14:56:35.269935 6242 lb_config.go:1031] Cluster endpoints for openshift-operator-lifecycle-manager/package-server-manager-metrics for network=default are: map[]\\\\nI1001 14:56:3\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP 
event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-n
etd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.629998 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.643830 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.657722 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.671072 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.686319 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.704105 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.714765 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.714792 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.714800 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.714813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.714824 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.717398 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.735436 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.746765 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:56:59Z is after 2025-08-24T17:21:41Z" Oct 01 14:56:59 crc 
kubenswrapper[4771]: I1001 14:56:59.817104 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.817189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.817223 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.817254 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.817279 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.919891 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.919942 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.919953 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.919969 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:56:59 crc kubenswrapper[4771]: I1001 14:56:59.919982 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:56:59Z","lastTransitionTime":"2025-10-01T14:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.022935 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.023005 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.023024 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.023049 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.023066 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.125812 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.125871 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.125887 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.125910 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.125928 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.229905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.229967 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.229977 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.229995 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.230005 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.333499 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.333569 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.333592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.333629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.333657 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.436529 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.436617 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.436638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.436665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.436683 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.440059 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/2.log" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.443138 4771 scope.go:117] "RemoveContainer" containerID="76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33" Oct 01 14:57:00 crc kubenswrapper[4771]: E1001 14:57:00.443284 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.466410 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.482769 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.493805 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.504480 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.518337 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.537445 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.539153 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.539197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.539209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.539226 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.539238 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.550721 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.561235 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.570611 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.583112 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.594562 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.603632 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.613072 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.631258 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.642068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.642142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.642160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.642182 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.642197 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.643637 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.658631 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.668953 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.678118 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:00Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:00 crc 
kubenswrapper[4771]: I1001 14:57:00.745006 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.745044 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.745055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.745070 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.745082 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.847337 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.847371 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.847382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.847398 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.847409 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.949923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.949989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.950008 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.950035 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.950056 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:00Z","lastTransitionTime":"2025-10-01T14:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.984505 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.984550 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.984580 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:00 crc kubenswrapper[4771]: I1001 14:57:00.984519 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:00 crc kubenswrapper[4771]: E1001 14:57:00.984613 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:00 crc kubenswrapper[4771]: E1001 14:57:00.984685 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:00 crc kubenswrapper[4771]: E1001 14:57:00.984778 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:00 crc kubenswrapper[4771]: E1001 14:57:00.984845 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.052702 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.052782 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.052793 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.052812 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.052825 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.155388 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.155439 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.155450 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.155469 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.155483 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.257374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.257412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.257420 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.257433 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.257442 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.360181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.360252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.360264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.360278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.360288 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.462065 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.462119 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.462127 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.462141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.462154 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.565124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.565187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.565198 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.565217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.565229 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.668030 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.668090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.668104 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.668128 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.668144 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.770100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.770181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.770200 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.770226 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.770246 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.872774 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.872820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.872832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.872848 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.872860 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.975520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.975596 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.975608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.975629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:01 crc kubenswrapper[4771]: I1001 14:57:01.975653 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:01Z","lastTransitionTime":"2025-10-01T14:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.082839 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.082898 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.082911 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.083321 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.083335 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.186393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.186427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.186438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.186454 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.186465 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.289487 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.289539 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.289556 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.289577 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.289595 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.391898 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.391950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.391961 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.391979 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.391989 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.495285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.495330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.495342 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.495362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.495376 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.597365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.597405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.597413 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.597427 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.597436 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.699832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.699884 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.699893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.699906 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.699918 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.802810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.802884 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.802905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.802930 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.802949 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.906024 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.906086 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.906105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.906130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.906147 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:02Z","lastTransitionTime":"2025-10-01T14:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.984585 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.984667 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.984711 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:02 crc kubenswrapper[4771]: I1001 14:57:02.984746 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:02 crc kubenswrapper[4771]: E1001 14:57:02.984888 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:02 crc kubenswrapper[4771]: E1001 14:57:02.985010 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:02 crc kubenswrapper[4771]: E1001 14:57:02.985129 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:02 crc kubenswrapper[4771]: E1001 14:57:02.985196 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.008518 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.008607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.008622 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.008640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.008654 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.112214 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.112252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.112261 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.112275 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.112285 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.214848 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.214891 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.214903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.214921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.214936 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.317537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.317615 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.317627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.317646 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.317660 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.420040 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.420103 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.420120 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.420143 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.420158 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.523294 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.523343 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.523357 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.523376 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.523392 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.614627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.614690 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.614710 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.614763 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.614780 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: E1001 14:57:03.630032 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:03Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.634065 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.634113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.634125 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.634142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.634153 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: E1001 14:57:03.646752 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:03Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.651486 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.651517 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.651525 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.651543 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.651552 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: E1001 14:57:03.665449 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:03Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.669128 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.669210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.669226 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.669242 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.669267 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: E1001 14:57:03.680696 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:03Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.684117 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.684164 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.684176 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.684192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.684204 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: E1001 14:57:03.697777 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:03Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:03 crc kubenswrapper[4771]: E1001 14:57:03.697941 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.700557 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.700622 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.700637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.700656 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.700669 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.803194 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.803231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.803240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.803256 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.803266 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.906105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.906169 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.906181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.906197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:03 crc kubenswrapper[4771]: I1001 14:57:03.906208 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:03Z","lastTransitionTime":"2025-10-01T14:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.009030 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.009108 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.009128 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.009157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.009204 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.112618 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.112679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.112698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.112724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.112775 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.215331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.215372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.215385 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.215405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.215418 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.318405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.318449 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.318460 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.318478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.318491 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.420934 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.420999 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.421025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.421053 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.421075 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.523558 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.523600 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.523609 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.523625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.523636 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.626250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.626318 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.626335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.626358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.626376 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.729221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.729278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.729294 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.729318 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.729335 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.832649 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.832689 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.832698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.832715 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.832743 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.936017 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.936072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.936084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.936100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.936112 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:04Z","lastTransitionTime":"2025-10-01T14:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.985136 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.985202 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.985203 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:04 crc kubenswrapper[4771]: I1001 14:57:04.985206 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:04 crc kubenswrapper[4771]: E1001 14:57:04.985332 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:04 crc kubenswrapper[4771]: E1001 14:57:04.985416 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:04 crc kubenswrapper[4771]: E1001 14:57:04.985512 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:04 crc kubenswrapper[4771]: E1001 14:57:04.985588 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.038503 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.038560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.038578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.038605 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.038629 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.141185 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.141217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.141230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.141244 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.141254 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.244431 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.244499 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.244520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.244544 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.244564 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.346682 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.346960 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.346978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.347001 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.347017 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.450188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.450285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.450302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.450325 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.450343 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.553090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.553148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.553166 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.553189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.553207 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.656463 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.656537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.656556 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.656638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.656658 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.759266 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.759301 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.759312 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.759328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.759340 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.861902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.861955 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.861967 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.861984 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.861997 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.966247 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.966949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.966983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.967012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:05 crc kubenswrapper[4771]: I1001 14:57:05.967030 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:05Z","lastTransitionTime":"2025-10-01T14:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.008079 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc 
kubenswrapper[4771]: I1001 14:57:06.029045 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.043895 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.057277 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.068784 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.071473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.071528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.071539 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.071557 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.071569 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.082596 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.100452 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc
2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.124136 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.140553 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.156693 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.173241 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.174597 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.174649 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.174662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.174679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.174690 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.188254 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4dd44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.201010 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\
\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.213156 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.233007 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.247754 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.260903 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.276478 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:06Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.278080 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.278125 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.278140 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.278158 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.278175 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.380405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.380465 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.380478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.380501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.380515 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.483224 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.483287 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.483305 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.483345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.483362 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.585686 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.585752 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.585763 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.585781 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.585791 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.688137 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.688187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.688199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.688213 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.688223 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.790525 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.790567 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.790578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.790593 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.790604 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.893278 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.893323 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.893335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.893352 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.893390 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.984916 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.985015 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.985015 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:06 crc kubenswrapper[4771]: E1001 14:57:06.985100 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.985201 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:06 crc kubenswrapper[4771]: E1001 14:57:06.985268 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:06 crc kubenswrapper[4771]: E1001 14:57:06.985380 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:06 crc kubenswrapper[4771]: E1001 14:57:06.985532 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.995913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.995968 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.995982 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.996001 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:06 crc kubenswrapper[4771]: I1001 14:57:06.996013 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:06Z","lastTransitionTime":"2025-10-01T14:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.003066 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.099238 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.099292 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.099304 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.099321 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.099337 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.201866 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.201924 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.201939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.201957 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.201972 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.304797 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.304848 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.304857 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.304879 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.304893 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.407997 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.408070 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.408088 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.408116 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.408135 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.510141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.510202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.510216 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.510231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.510242 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.613090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.613154 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.613163 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.613176 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.613185 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.715390 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.715449 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.715461 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.715477 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.715489 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.817870 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.817905 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.817916 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.817933 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.817946 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.920493 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.920528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.920540 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.920556 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:07 crc kubenswrapper[4771]: I1001 14:57:07.920567 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:07Z","lastTransitionTime":"2025-10-01T14:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.022570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.022605 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.022614 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.022636 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.022648 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.126064 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.126169 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.126193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.126217 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.126234 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.228810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.228842 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.228851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.228864 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.228873 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.331772 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.331854 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.331881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.331913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.331935 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.434923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.434988 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.435012 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.435037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.435055 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.537615 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.537681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.537699 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.537724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.537778 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.639896 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.639939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.639949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.639964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.639976 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.742577 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.742633 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.742649 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.742674 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.742691 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.846643 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.846726 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.846780 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.846814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.846837 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.950501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.950568 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.950585 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.950610 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.950628 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:08Z","lastTransitionTime":"2025-10-01T14:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.984854 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.984931 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.984988 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:08 crc kubenswrapper[4771]: E1001 14:57:08.985047 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:08 crc kubenswrapper[4771]: E1001 14:57:08.985253 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:08 crc kubenswrapper[4771]: I1001 14:57:08.985339 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:08 crc kubenswrapper[4771]: E1001 14:57:08.985420 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:08 crc kubenswrapper[4771]: E1001 14:57:08.985565 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.054271 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.054347 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.054374 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.054405 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.054428 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.141120 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:09 crc kubenswrapper[4771]: E1001 14:57:09.141385 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:57:09 crc kubenswrapper[4771]: E1001 14:57:09.141453 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs podName:a49c960d-cfd1-4745-976b-59c62e3dcf8e nodeName:}" failed. No retries permitted until 2025-10-01 14:57:41.14143209 +0000 UTC m=+105.760607281 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs") pod "network-metrics-daemon-8qdkc" (UID: "a49c960d-cfd1-4745-976b-59c62e3dcf8e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.157753 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.157794 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.157810 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.157829 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.157845 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.260495 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.260547 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.260564 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.260586 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.260604 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.364045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.364120 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.364147 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.364173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.364192 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.467134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.467170 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.467180 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.467192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.467202 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.569414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.569473 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.569489 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.569514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.569534 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.672270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.672323 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.672335 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.672351 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.672363 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.774660 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.774709 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.774746 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.774764 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.774777 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.877525 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.877571 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.877581 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.877594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.877604 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.980046 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.980104 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.980121 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.980141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:09 crc kubenswrapper[4771]: I1001 14:57:09.980154 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:09Z","lastTransitionTime":"2025-10-01T14:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.083468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.083550 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.083574 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.083607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.083632 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.187450 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.187505 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.187515 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.187535 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.187547 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.291352 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.291408 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.291422 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.291445 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.291458 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.395327 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.395369 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.395378 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.395400 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.395411 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.498346 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.498415 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.498426 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.498448 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.498461 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.602474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.602541 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.602559 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.602583 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.602603 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.705900 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.705979 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.706003 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.706033 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.706059 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.809804 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.809870 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.809888 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.809915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.809933 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.913200 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.913280 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.913303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.913333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.913355 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:10Z","lastTransitionTime":"2025-10-01T14:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.984394 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.984447 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.984535 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:10 crc kubenswrapper[4771]: E1001 14:57:10.984761 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:10 crc kubenswrapper[4771]: I1001 14:57:10.984901 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:10 crc kubenswrapper[4771]: E1001 14:57:10.984972 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:10 crc kubenswrapper[4771]: E1001 14:57:10.985139 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:10 crc kubenswrapper[4771]: E1001 14:57:10.985320 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.015875 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.015928 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.015945 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.015967 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.015985 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.119492 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.119588 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.119626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.119665 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.119686 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.223125 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.223195 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.223213 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.223238 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.223256 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.325939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.326023 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.326050 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.326073 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.326092 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.430467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.430540 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.430565 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.430594 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.430618 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.534182 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.534251 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.534274 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.534299 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.534318 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.639486 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.639550 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.639573 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.639601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.639623 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.742294 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.742346 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.742363 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.742384 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.742401 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.846403 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.846462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.846478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.846502 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.846519 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.949540 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.950459 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.950912 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.951300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:11 crc kubenswrapper[4771]: I1001 14:57:11.951457 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:11Z","lastTransitionTime":"2025-10-01T14:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.054608 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.054698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.054720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.054778 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.054799 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.158512 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.158570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.158589 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.158613 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.158631 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.261848 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.261901 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.261920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.261949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.261967 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.365144 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.365210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.365232 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.365261 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.365284 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.468488 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.468549 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.468568 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.468588 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.468603 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.572087 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.572228 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.572260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.572284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.572303 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.675796 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.675902 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.675969 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.676005 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.676023 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.780358 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.780419 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.780436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.780462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.780483 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.883766 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.883824 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.883837 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.883859 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.883874 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.984723 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.984715 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.984813 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.984904 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:12 crc kubenswrapper[4771]: E1001 14:57:12.985060 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:12 crc kubenswrapper[4771]: E1001 14:57:12.986114 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:12 crc kubenswrapper[4771]: E1001 14:57:12.986236 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:12 crc kubenswrapper[4771]: E1001 14:57:12.986374 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.986912 4771 scope.go:117] "RemoveContainer" containerID="76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33" Oct 01 14:57:12 crc kubenswrapper[4771]: E1001 14:57:12.987430 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.987591 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.987655 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.987679 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.987726 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:12 crc kubenswrapper[4771]: I1001 14:57:12.987784 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:12Z","lastTransitionTime":"2025-10-01T14:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.090289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.090354 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.090371 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.090395 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.090414 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.193690 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.193769 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.193784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.193805 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.193821 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.297562 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.297607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.297620 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.297644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.297660 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.401236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.401274 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.401285 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.401303 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.401314 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.488843 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/0.log" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.488892 4771 generic.go:334] "Generic (PLEG): container finished" podID="c96a3328-c79b-4528-b9b5-badbc7380dd6" containerID="1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d" exitCode=1 Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.488923 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lvcz" event={"ID":"c96a3328-c79b-4528-b9b5-badbc7380dd6","Type":"ContainerDied","Data":"1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.489296 4771 scope.go:117] "RemoveContainer" containerID="1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.503595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.503647 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.503661 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.503677 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.504100 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.510793 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b
8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 
14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89
c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.527449 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.541985 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.553920 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.569659 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.594102 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a
3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.606621 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.606670 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.606683 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.606698 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.606711 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.610384 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.628194 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.644261 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4333c8d-eeca-4f52-a25b-4ba337b94469\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621005989b46f09ca22c899db00d49019b74cf946212ec59e70ed6d11fd88118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.659398 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.671189 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.683756 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.699085 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.709132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.709356 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.709442 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.709648 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.709758 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.727555 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.744646 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.759116 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.777259 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"2025-10-01T14:56:27+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba\\\\n2025-10-01T14:56:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba to /host/opt/cni/bin/\\\\n2025-10-01T14:56:28Z [verbose] multus-daemon started\\\\n2025-10-01T14:56:28Z [verbose] Readiness Indicator file check\\\\n2025-10-01T14:57:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.790908 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.807531 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.811893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.812101 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.812265 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.812396 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.812505 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.915611 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.915994 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.916143 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.916281 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.916454 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.945222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.945265 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.945281 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.945302 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.945319 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: E1001 14:57:13.964866 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.970256 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.970321 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.970341 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.970364 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.970407 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:13 crc kubenswrapper[4771]: E1001 14:57:13.991298 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:13Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.995940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.995991 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.996005 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.996020 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:13 crc kubenswrapper[4771]: I1001 14:57:13.996035 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:13Z","lastTransitionTime":"2025-10-01T14:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: E1001 14:57:14.016843 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.021964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.022261 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.022411 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.022578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.022782 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: E1001 14:57:14.042650 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.051023 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.051072 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.051083 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.051105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.051117 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: E1001 14:57:14.073376 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: E1001 14:57:14.073547 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.076585 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.076695 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.076719 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.076791 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.076823 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.180986 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.181084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.181109 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.181139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.181159 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.283865 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.283910 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.283921 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.283939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.283953 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.386688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.386788 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.386809 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.386835 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.386852 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.489952 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.490042 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.490068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.490100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.490124 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.495525 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/0.log" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.495599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lvcz" event={"ID":"c96a3328-c79b-4528-b9b5-badbc7380dd6","Type":"ContainerStarted","Data":"4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.518541 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.537968 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4dd44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.552566 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4333c8d-eeca-4f52-a25b-4ba337b94469\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621005989b46f09ca22c899db00d49019b74cf946212ec59e70ed6d11fd88118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.570377 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.584603 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.593572 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.593620 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.593633 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.593652 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.593665 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.599050 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.619324 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.644814 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.661802 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.678261 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.694293 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"2025-10-01T14:56:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba\\\\n2025-10-01T14:56:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba to /host/opt/cni/bin/\\\\n2025-10-01T14:56:28Z [verbose] multus-daemon started\\\\n2025-10-01T14:56:28Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T14:57:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:57:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.696546 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.696621 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.696644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.696672 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.696694 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.708056 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc 
kubenswrapper[4771]: I1001 14:57:14.725487 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.745334 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b000
9e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.759925 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.771047 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.782948 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.800185 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.800255 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.800273 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.800297 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.800317 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.801948 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.820401 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:14Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.904264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.904342 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.904364 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.904389 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.904413 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:14Z","lastTransitionTime":"2025-10-01T14:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.984338 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.984354 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.984365 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:14 crc kubenswrapper[4771]: E1001 14:57:14.984992 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:14 crc kubenswrapper[4771]: E1001 14:57:14.984969 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:14 crc kubenswrapper[4771]: E1001 14:57:14.985115 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:14 crc kubenswrapper[4771]: I1001 14:57:14.984423 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:14 crc kubenswrapper[4771]: E1001 14:57:14.985226 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.008026 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.008365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.008493 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.008631 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.008792 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.111951 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.112371 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.112552 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.112694 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.112905 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.216116 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.216171 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.216188 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.216210 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.216229 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.319973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.320051 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.320076 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.320109 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.320134 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.424038 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.424508 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.424687 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.425141 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.425444 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.528708 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.528979 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.529061 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.529156 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.529237 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.633469 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.633536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.633554 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.633580 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.633597 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.737715 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.738325 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.738506 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.738692 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.738922 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.841376 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.841438 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.841456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.841479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.841497 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.945197 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.945632 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.945850 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.946071 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:15 crc kubenswrapper[4771]: I1001 14:57:15.946251 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:15Z","lastTransitionTime":"2025-10-01T14:57:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.004540 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732
df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.026154 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.039274 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.049860 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.049903 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.049914 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.049928 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.049938 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.050951 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.063253 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.083315 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b91
1f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.096933 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.116471 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.135144 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"2025-10-01T14:56:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba\\\\n2025-10-01T14:56:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba to /host/opt/cni/bin/\\\\n2025-10-01T14:56:28Z [verbose] multus-daemon started\\\\n2025-10-01T14:56:28Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T14:57:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:57:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.149281 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.166135 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.166178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.166189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.166205 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.166216 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.202489 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.219166 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b000
9e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.233335 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.247208 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.261475 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.269100 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.269157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.269175 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.269199 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.269216 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.276272 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.288366 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4333c8d-eeca-4f52-a25b-4ba337b94469\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621005989b46f09ca22c899db00d49019b74cf946212ec59e70ed6d11fd88118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc 
kubenswrapper[4771]: I1001 14:57:16.304363 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.316962 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:16Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.371330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.371369 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.371380 4771 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.371394 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.371404 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.474268 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.474595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.474844 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.475013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.475147 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.577564 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.577703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.577717 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.577746 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.577757 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.680949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.681662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.681772 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.681925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.682028 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.784983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.785300 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.785371 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.785456 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.785535 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.888616 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.888700 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.888720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.888825 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.888859 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.984574 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.984646 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.984606 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.984574 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:16 crc kubenswrapper[4771]: E1001 14:57:16.984931 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:16 crc kubenswrapper[4771]: E1001 14:57:16.985203 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:16 crc kubenswrapper[4771]: E1001 14:57:16.985286 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:16 crc kubenswrapper[4771]: E1001 14:57:16.985885 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.995872 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.996851 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.996918 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.996988 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:16 crc kubenswrapper[4771]: I1001 14:57:16.997098 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:16Z","lastTransitionTime":"2025-10-01T14:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.100994 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.101062 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.101080 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.101106 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.101124 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.204620 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.205045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.205148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.205245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.205337 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.308333 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.308821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.309061 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.309246 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.309420 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.413468 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.413972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.414129 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.414270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.414582 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.517617 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.517680 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.517702 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.517727 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.517771 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.620486 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.620816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.620917 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.621045 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.621147 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.724560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.724625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.724644 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.724673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.724692 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.828257 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.828343 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.828369 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.828401 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.828425 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.931316 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.931387 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.931409 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.931436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:17 crc kubenswrapper[4771]: I1001 14:57:17.931457 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:17Z","lastTransitionTime":"2025-10-01T14:57:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.034037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.034113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.034137 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.034168 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.034190 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.137517 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.137583 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.137601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.137629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.137650 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.240774 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.240813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.240824 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.240840 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.240851 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.345130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.345203 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.345221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.345245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.345264 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.448897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.448958 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.448979 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.449004 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.449021 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.551067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.551361 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.551435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.551509 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.551643 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.654843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.654915 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.654941 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.654969 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.654992 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.758067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.758130 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.758148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.758173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.758190 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.861336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.861395 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.861412 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.861435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.861452 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.964272 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.964363 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.964384 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.964440 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.964460 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:18Z","lastTransitionTime":"2025-10-01T14:57:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.984815 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:18 crc kubenswrapper[4771]: E1001 14:57:18.985148 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.984926 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.984909 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:18 crc kubenswrapper[4771]: I1001 14:57:18.984842 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:18 crc kubenswrapper[4771]: E1001 14:57:18.985794 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:18 crc kubenswrapper[4771]: E1001 14:57:18.985524 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:18 crc kubenswrapper[4771]: E1001 14:57:18.985859 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.066919 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.066953 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.066963 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.066978 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.066987 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.170676 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.170782 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.170807 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.170839 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.170864 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.274261 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.274535 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.274704 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.274963 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.275144 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.378134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.378177 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.378193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.378215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.378231 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.481571 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.481645 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.481662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.481688 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.481705 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.584995 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.585065 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.585083 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.585117 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.585136 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.688330 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.688401 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.688425 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.688455 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.688478 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.791958 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.792034 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.792054 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.792079 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.792100 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.895164 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.895200 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.895211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.895227 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.895238 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.998273 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.998331 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.998350 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.998373 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:19 crc kubenswrapper[4771]: I1001 14:57:19.998391 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:19Z","lastTransitionTime":"2025-10-01T14:57:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.101395 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.101477 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.101496 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.101521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.101540 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.204213 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.204257 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.204267 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.204284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.204299 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.306953 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.307084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.307120 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.307140 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.307150 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.409821 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.409941 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.410016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.410068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.410089 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.513703 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.513809 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.513834 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.513858 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.513875 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.617187 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.617245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.617271 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.617299 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.617319 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.720169 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.720233 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.720258 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.720284 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.720307 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.823281 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.823348 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.823365 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.823447 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.823483 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.875103 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.875446 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:24.87540417 +0000 UTC m=+149.494579371 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.927585 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.927638 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.927657 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.927681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.927699 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:20Z","lastTransitionTime":"2025-10-01T14:57:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.976211 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.976280 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.976327 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.976396 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976446 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976502 4771 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976560 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976564 4771 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976583 4771 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976587 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:24.976565563 +0000 UTC m=+149.595740764 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976765 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:24.976710946 +0000 UTC m=+149.595886157 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976446 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976802 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:24.976784268 +0000 UTC m=+149.595959469 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976833 4771 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976860 4771 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.976936 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:24.976911701 +0000 UTC m=+149.596086902 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.985561 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.985575 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.985575 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:20 crc kubenswrapper[4771]: I1001 14:57:20.985601 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.985931 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.986028 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.986157 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:20 crc kubenswrapper[4771]: E1001 14:57:20.986303 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.031259 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.031328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.031352 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.031377 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.031396 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.134879 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.134956 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.134981 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.135014 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.135039 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.239060 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.239202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.239222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.239249 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.239266 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.342233 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.342304 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.342324 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.342352 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.342372 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.446552 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.446629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.446647 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.446673 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.446691 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.549770 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.549840 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.549861 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.549889 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.549912 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.653215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.653293 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.653317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.653348 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.653364 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.756376 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.756445 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.756466 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.756491 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.756508 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.859402 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.859472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.859487 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.859517 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.859535 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.963117 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.963194 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.963211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.963236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:21 crc kubenswrapper[4771]: I1001 14:57:21.963254 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:21Z","lastTransitionTime":"2025-10-01T14:57:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.067031 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.067093 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.067115 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.067139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.067157 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.169964 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.170025 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.170042 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.170064 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.171883 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.275625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.275722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.275784 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.275813 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.275838 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.378724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.378842 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.378867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.378897 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.378918 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.482367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.482462 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.482487 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.482514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.482535 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.586015 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.586135 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.586157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.586184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.586205 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.689526 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.689601 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.689625 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.689653 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.689671 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.792678 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.792776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.792803 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.792830 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.792848 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.896030 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.896113 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.896137 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.896167 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.896188 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.984300 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.984375 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.984382 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.984325 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:22 crc kubenswrapper[4771]: E1001 14:57:22.984486 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:22 crc kubenswrapper[4771]: E1001 14:57:22.984780 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:22 crc kubenswrapper[4771]: E1001 14:57:22.984879 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:22 crc kubenswrapper[4771]: E1001 14:57:22.984988 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.999198 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.999231 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.999248 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.999270 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:22 crc kubenswrapper[4771]: I1001 14:57:22.999287 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:22Z","lastTransitionTime":"2025-10-01T14:57:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.102809 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.102883 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.102913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.102949 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.102976 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.205397 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.205470 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.205488 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.205511 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.205528 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.308954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.309054 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.309074 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.309097 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.309117 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.412190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.412229 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.412240 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.412256 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.412268 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.515134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.515203 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.515221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.515245 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.515262 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.617798 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.617939 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.617963 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.617987 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.618005 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.720523 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.720592 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.720609 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.720634 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.720653 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.823681 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.823801 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.823830 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.823860 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.823884 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.926662 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.926722 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.926767 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.926791 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:23 crc kubenswrapper[4771]: I1001 14:57:23.926808 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:23Z","lastTransitionTime":"2025-10-01T14:57:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.029500 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.029549 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.029566 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.029590 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.029606 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.132852 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.132922 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.132944 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.132973 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.132996 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.236627 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.236696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.236711 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.236760 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.236773 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.340355 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.340410 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.340417 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.340433 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.340442 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.421152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.421211 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.421221 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.421237 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.421249 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.438849 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.448121 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.448176 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.448194 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.448209 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.448219 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.471044 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.476060 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.476131 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.476154 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.476183 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.476207 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.497815 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.501696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.501748 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.501761 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.501785 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.501797 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.514094 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.517658 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.517684 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.517695 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.517713 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.517725 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.531841 4771 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404552Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865352Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f03ada0f-e2c8-42c8-86e3-3e9572f1e63b\\\",\\\"systemUUID\\\":\\\"ab8b87ec-94d1-4eae-9ea3-b28f83991d01\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:24Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.531995 4771 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.534010 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.534052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.534061 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.534077 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.534088 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.637185 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.637514 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.637623 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.637716 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.637852 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.740835 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.740881 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.740893 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.740910 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.740925 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.844121 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.844181 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.844201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.844226 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.844244 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.947064 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.947120 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.947138 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.947160 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.947177 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:24Z","lastTransitionTime":"2025-10-01T14:57:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.985131 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.985253 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.985293 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.985355 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.985379 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.985471 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.985843 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:24 crc kubenswrapper[4771]: E1001 14:57:24.986493 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:24 crc kubenswrapper[4771]: I1001 14:57:24.986806 4771 scope.go:117] "RemoveContainer" containerID="76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.050265 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.050329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.050345 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.050403 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.050421 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.153677 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.154119 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.154132 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.154157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.154174 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.257248 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.257317 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.257328 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.257342 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.257352 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.360776 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.360863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.360888 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.360923 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.360948 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.464041 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.464085 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.464096 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.464111 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.464123 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.541610 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/2.log" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.544112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.544598 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.561992 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4333c8d-eeca-4f52-a25b-4ba337b94469\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621005989b46f09ca22c899db00d49019b74cf946212ec59e70ed6d11fd88118\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.565988 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.566028 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.566037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.566052 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.566064 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.573677 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.587652 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.603904 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.624797 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.641762 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.654056 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.667629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.667696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.667709 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.667726 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.667764 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.676669 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.700409 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.716580 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.734523 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.754708 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"2025-10-01T14:56:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba\\\\n2025-10-01T14:56:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba to /host/opt/cni/bin/\\\\n2025-10-01T14:56:28Z [verbose] multus-daemon started\\\\n2025-10-01T14:56:28Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T14:57:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:57:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.771480 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.772056 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.772607 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.772650 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.772683 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.772708 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.797550 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.813008 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b000
9e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.829240 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.843065 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.858208 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.876087 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.876139 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.876157 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.876180 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.876205 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.881830 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.979753 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.979802 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.979814 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.979832 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.979845 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:25Z","lastTransitionTime":"2025-10-01T14:57:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:25 crc kubenswrapper[4771]: I1001 14:57:25.999588 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:25Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc 
kubenswrapper[4771]: I1001 14:57:26.019393 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.035059 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.055585 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"2025-10-01T14:56:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba\\\\n2025-10-01T14:56:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba to /host/opt/cni/bin/\\\\n2025-10-01T14:56:28Z [verbose] multus-daemon started\\\\n2025-10-01T14:56:28Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T14:57:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:57:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.074627 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512
335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.082720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.082852 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.082875 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.082900 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.082920 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.089673 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.108322 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.144584 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a
3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.166292 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b000
9e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.185254 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.185373 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.185449 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.185483 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.185556 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.186832 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016
d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.201572 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4333c8d-eeca-4f52-a25b-4ba337b94469\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621005989b46f09ca22c899db00d49019b74cf946212ec59e70ed6d11fd88118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.218899 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.237329 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.254787 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.271059 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.287616 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.287678 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.287696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.287720 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.287778 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.295356 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.309182 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.328147 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.346266 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.390548 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.390626 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.390652 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.390684 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.390710 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.494540 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.494599 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.494612 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.494630 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.494642 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.549240 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/3.log" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.549975 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/2.log" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.553089 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" exitCode=1 Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.553131 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.553172 4771 scope.go:117] "RemoveContainer" containerID="76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.554614 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 14:57:26 crc kubenswrapper[4771]: E1001 14:57:26.554857 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.572573 4771 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://02063fddc7a255292ea7cdd6a318546b1573a0701787f4ab839ac7bea5b1ccb7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.591666 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.597528 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.597569 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.597578 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.597595 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.597606 4771 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.603355 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7wr7q" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0bb66959-800a-45dc-909f-1a093c578823\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7abffda55d62e4f219933292ded99619fb5bfbbe87a5091c8aaaee6ea6162353\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bdn7t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7wr7q\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.619445 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"289ee6d3-fabe-417f-964c-76ca03c143cc\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://37230c61c3cdf57d73df404731eb692cf20c46a8d983ee40c0aef7ee1f3ad839\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://161a9b5c96d73213aa3d0261956487464c5e1398
e47d221db9fccafdfdc51856\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2svt4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-vck47\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.643045 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a061b8e2-74a8-4953-bfa2-5090a2f70459\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76e2370562a77d3eb4433f434869c23f9e4743501a2069aed27f5fd25c61ec33\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:56:58Z\\\",\\\"message\\\":\\\"pods:v4/a13607449821398607916) with []\\\\nI1001 14:56:58.971658 6554 factory.go:1336] Added *v1.Node event handler 7\\\\nI1001 14:56:58.971782 6554 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1001 14:56:58.971818 6554 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1001 14:56:58.971894 6554 handler.go:208] 
Removed *v1.Node event handler 2\\\\nI1001 14:56:58.971933 6554 handler.go:208] Removed *v1.Node event handler 7\\\\nI1001 14:56:58.971902 6554 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI1001 14:56:58.971981 6554 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1001 14:56:58.972021 6554 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1001 14:56:58.972083 6554 factory.go:656] Stopping watch factory\\\\nI1001 14:56:58.972119 6554 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1001 14:56:58.972126 6554 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1001 14:56:58.972841 6554 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI1001 14:56:58.972996 6554 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI1001 14:56:58.973494 6554 ovnkube.go:599] Stopped ovnkube\\\\nI1001 14:56:58.973545 6554 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF1001 14:56:58.973645 6554 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:57:25Z\\\",\\\"message\\\":\\\" Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:57:25.899286 6904 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1001 14:57:25.899340 6904 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 14:57:25.899481 6904 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 14:57:25.899682 6904 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI1001 14:57:25.899775 6904 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 14:57:25.899863 6904 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI1001 14:57:25.900213 6904 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:57:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath
\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nnspx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-j7ntp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.660234 4771 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2b4b8e-b886-4fa6-abf2-6bffd3d7dd4f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe139fdfee8f2ebb2368fa660edd669455c3b903836d7ef6212dea9921d8488b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5837a628a7ed87d4bc032e06b4732df175e922bf49ecbffee596f79c5357c53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e47111864672ec3e393187147b7390f995634d4d32bf75915b5cdbb3915aca9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://eab2c2fe2d4eae570e4686e0c48ff8e9407ff544bcd9f5339371287c23449333\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.675895 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.690236 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-9lvcz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c96a3328-c79b-4528-b9b5-badbc7380dd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:57:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-01T14:57:13Z\\\",\\\"message\\\":\\\"2025-10-01T14:56:27+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba\\\\n2025-10-01T14:56:27+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_cc4ecaff-34b0-427e-9f9c-08595102f3ba to /host/opt/cni/bin/\\\\n2025-10-01T14:56:28Z [verbose] multus-daemon started\\\\n2025-10-01T14:56:28Z [verbose] 
Readiness Indicator file check\\\\n2025-10-01T14:57:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:57:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xs5q5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-9lvcz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.700471 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.700616 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.700641 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.700666 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.700698 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.705240 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a49c960d-cfd1-4745-976b-59c62e3dcf8e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m9mq6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-8qdkc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc 
kubenswrapper[4771]: I1001 14:57:26.719635 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:16Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.735411 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b439925b-c79c-4b66-957f-5be27680bbc7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8551072e274183e6126e06729369468bcf9a98a9a487ff2febe02b088a6a51aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dd6957bcec2f931ddc0f584bb947cc6c8e19aee87c4fa2d2727151312e00bc2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://eea2a54d9ce79c067adf62a1a112758028562f68fd877d7b5c1f0ac808fde931\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://77ba775d006f7fce834b114986ea63340af2cca2e7d10b6fc3be16555f278fd1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9e373175246c2f3fda16bb068145e8bde0498cedf3a53f4c7b96168cce67570\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-01T14:56:17Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW1001 14:56:16.707999 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI1001 14:56:16.708248 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1001 14:56:16.709042 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1763409596/tls.crt::/tmp/serving-cert-1763409596/tls.key\\\\\\\"\\\\nI1001 14:56:17.040394 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1001 14:56:17.043951 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1001 14:56:17.043976 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1001 14:56:17.043997 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1001 14:56:17.044002 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1001 14:56:17.049470 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nI1001 14:56:17.049534 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW1001 14:56:17.049584 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049607 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1001 14:56:17.049630 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1001 14:56:17.049653 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1001 14:56:17.049673 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1001 14:56:17.049696 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1001 14:56:17.051084 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:11Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d920faabd05e65200c4defe1588344af5ae2b34148b4213bdc86abe9dd19d236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:59Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b0009e0b11d1afd4327cf63548e50eef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://80a2883398e8193ceeef8a6a139ff16b000
9e0b11d1afd4327cf63548e50eef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.753006 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fa69b46a-20c4-4ce1-8be9-e945c6865b0f\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0651d298d7c1364b4de3b030574abd7d6ba8ccacab5872d74162878f92592cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b72097f8016d0e79134956d3555973cd7c137c30ab5eade1bd92805d16f66bf2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://495c07855c881f2a9cd788740e2197c165a38891c729a8563f3540bf5c6d8dcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fb83a6932caf24a6b2430836bc2800415028a8134a2355a816ccbb73efbaf46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.769025 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:21Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad0b1d01c4ff14005e40574c1663d3fafddd1527b8d283ee2f2364a861e3c351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.784800 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kmlgz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2129365d-0a99-4cf0-a561-fd4126d1bfc7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1dabcd9bb31c364a82e0015bb58c48344f35fd73013cb9eb2c9d178ea6befbdc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qqx4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:22Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kmlgz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.800912 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4be36f4b-1171-4281-a7ac-43e411e080f7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://65f58af4fa01651762bc2de081e844beb25bd1468804d6dbf99d01be10dd80e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3c3e127f71165b9f19abc802e9550c4f4deab3f83aca8ba8871499300ded6f3a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8c0ee00488c800954e177fbacf16bd430961f9b97f3680eff3c46b643f6de68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdeff3f36db0a07da4971f008f54a6f9b2245ec15dbe1217abeb44c8a8d923dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2de6f
ba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2de6fba686faa921950529c819a2c523a4b839272dddde9ddd4e176da4644a96\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bebcf45d5ac49baeafce59deaae0e47b8c5d978d2e7e2f88fbd0c54a6d9a8917\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fa4d46d8b475c3b21131ef4884d68ddc2c2279e12808b68ef00ee06b59713645\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:56:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:56:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27krx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-jj6k4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.803238 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.803289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.803298 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.803313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.803323 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.820807 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8156f536-4329-4bc1-8ea3-653c5d719a19\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7dbd5c544c544544637a62c6e11372b59bdddc28c6ddb14e2cdd93d2d02a85b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb9cd85dad256af814a22cd9787f3c1d4061ff2b76a0fb1a92b547a10c809a52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d50224da1e43f0ad8835be229e79e94d6603a0fcb0879b9dd306e4cc357c96f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6def3f33927474e3a3c86835c9606ccfd32e235441e1646cdba4cef0b41e2f57\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe4f8936af05697692a6547962fa77f3ac8bb357e3484324073f7768b374060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f28dcab30cfdbe08cc44441794920f2674ec868d84f9e7fc81e09301e7aa889b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://866ed3ec6eea99bc3cda764b64a5593eb6340e8d8d25acc03c4552501ed13b9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7c3ad90be16a2a3a125986ce6bf68df189d87ea903123c5cdeb0306d5fe6446\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-01T14:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.834813 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:18Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e6ef7bbeb1f3ba58b4d1cfa2d55bcc4d7e7264d3db53e61495625b58a8503ee3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://679e6aca450f387cb31946627423ce8bc164d9815a7caf225e6cf1512f189cff\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.849251 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fe483b7b-ed55-4649-ac50-66ac981305e2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:56:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e00c87efe2f7a38dd71171e28ae517733a09ed433bd3fec878757e5094d423ca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fb235560f4fabd7d33a9286e029c075fefa4d
d44eea942cd8fe4ca74819ce722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:56:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2dtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:56:35Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-9jb9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.859966 4771 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b4333c8d-eeca-4f52-a25b-4ba337b94469\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-01T14:55:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://621005989b46f09ca22c899db00d49019b74cf946212ec59e70ed6d11fd88118\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-01T14:55:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bc6ccbd034a5825c2e5e55954fa2ad1b33ba80ec6c5c4dcbcf629fa71d57f3a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-01T14:55:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-01T14:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-01T14:55:56Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-01T14:57:26Z is after 2025-08-24T17:21:41Z" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.907064 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.907112 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.907128 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.907151 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.907168 4771 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:26Z","lastTransitionTime":"2025-10-01T14:57:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.984664 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.984801 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.984865 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:26 crc kubenswrapper[4771]: E1001 14:57:26.985052 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:26 crc kubenswrapper[4771]: I1001 14:57:26.985074 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:26 crc kubenswrapper[4771]: E1001 14:57:26.985414 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:26 crc kubenswrapper[4771]: E1001 14:57:26.985558 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:26 crc kubenswrapper[4771]: E1001 14:57:26.985678 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.010523 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.010579 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.010591 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.010640 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.010659 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.114013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.114054 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.114065 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.114083 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.114163 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.218081 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.218116 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.218127 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.218142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.218154 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.321029 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.321068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.321079 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.321094 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.321107 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.422970 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.423023 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.423040 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.423063 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.423080 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.526552 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.526671 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.526695 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.526724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.526778 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.560568 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/3.log" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.565676 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 14:57:27 crc kubenswrapper[4771]: E1001 14:57:27.565961 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.632828 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.632878 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.632895 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.632919 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.632940 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.658821 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-9lvcz" podStartSLOduration=65.658793026 podStartE2EDuration="1m5.658793026s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.644414389 +0000 UTC m=+92.263589600" watchObservedRunningTime="2025-10-01 14:57:27.658793026 +0000 UTC m=+92.277968227" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.710930 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=70.710909104 podStartE2EDuration="1m10.710909104s" podCreationTimestamp="2025-10-01 14:56:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.710260399 +0000 UTC m=+92.329435640" watchObservedRunningTime="2025-10-01 14:57:27.710909104 +0000 UTC m=+92.330084285" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.711227 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-jj6k4" podStartSLOduration=65.711222112 podStartE2EDuration="1m5.711222112s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.685612744 +0000 UTC m=+92.304787925" watchObservedRunningTime="2025-10-01 14:57:27.711222112 +0000 UTC m=+92.330397283" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.735393 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: 
I1001 14:57:27.735458 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.735474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.735492 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.735505 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.748030 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.74801011 podStartE2EDuration="1m10.74801011s" podCreationTimestamp="2025-10-01 14:56:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.747885467 +0000 UTC m=+92.367060718" watchObservedRunningTime="2025-10-01 14:57:27.74801011 +0000 UTC m=+92.367185291" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.782468 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=65.782431551 podStartE2EDuration="1m5.782431551s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-01 14:57:27.765078332 +0000 UTC m=+92.384253513" watchObservedRunningTime="2025-10-01 14:57:27.782431551 +0000 UTC m=+92.401606732" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.810666 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kmlgz" podStartSLOduration=65.810641312 podStartE2EDuration="1m5.810641312s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.796663535 +0000 UTC m=+92.415838726" watchObservedRunningTime="2025-10-01 14:57:27.810641312 +0000 UTC m=+92.429816523" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.811496 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=21.811484292 podStartE2EDuration="21.811484292s" podCreationTimestamp="2025-10-01 14:57:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.810252732 +0000 UTC m=+92.429427913" watchObservedRunningTime="2025-10-01 14:57:27.811484292 +0000 UTC m=+92.430659493" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.837848 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.837908 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.837925 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.837950 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.837967 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.886503 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-9jb9x" podStartSLOduration=64.886472263 podStartE2EDuration="1m4.886472263s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.844118341 +0000 UTC m=+92.463293592" watchObservedRunningTime="2025-10-01 14:57:27.886472263 +0000 UTC m=+92.505647474" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.900841 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.900821829 podStartE2EDuration="43.900821829s" podCreationTimestamp="2025-10-01 14:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.900482681 +0000 UTC m=+92.519657872" watchObservedRunningTime="2025-10-01 14:57:27.900821829 +0000 UTC m=+92.519997000" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.939696 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.939761 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.939775 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.939794 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.939809 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:27Z","lastTransitionTime":"2025-10-01T14:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.948726 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7wr7q" podStartSLOduration=65.948699614 podStartE2EDuration="1m5.948699614s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.947301591 +0000 UTC m=+92.566476762" watchObservedRunningTime="2025-10-01 14:57:27.948699614 +0000 UTC m=+92.567874825" Oct 01 14:57:27 crc kubenswrapper[4771]: I1001 14:57:27.964335 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podStartSLOduration=65.964306581 podStartE2EDuration="1m5.964306581s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:27.959808763 +0000 UTC m=+92.578983944" watchObservedRunningTime="2025-10-01 14:57:27.964306581 
+0000 UTC m=+92.583481762" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.042533 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.042614 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.042628 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.042646 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.042657 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.145490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.145555 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.145573 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.145598 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.145617 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.248639 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.248694 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.248705 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.248724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.248764 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.351542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.351603 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.351620 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.351642 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.351660 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.454906 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.454972 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.454989 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.455013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.455030 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.558529 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.560264 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.560444 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.560629 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.560806 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.665482 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.665954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.666145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.666247 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.666344 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.769713 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.770174 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.770268 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.770370 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.770456 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.873174 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.873220 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.873233 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.873252 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.873264 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.976066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.976131 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.976148 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.976173 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.976192 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:28Z","lastTransitionTime":"2025-10-01T14:57:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.984302 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc"
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.984336 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 14:57:28 crc kubenswrapper[4771]: E1001 14:57:28.984410 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e"
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.984469 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 14:57:28 crc kubenswrapper[4771]: I1001 14:57:28.984474 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 14:57:28 crc kubenswrapper[4771]: E1001 14:57:28.984805 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 14:57:28 crc kubenswrapper[4771]: E1001 14:57:28.984939 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 14:57:28 crc kubenswrapper[4771]: E1001 14:57:28.985352 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.079874 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.079930 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.079940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.079959 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.079970 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.184977 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.185090 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.185115 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.185142 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.185160 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.288868 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.288961 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.288984 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.289013 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.289036 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.391955 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.392418 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.392433 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.392454 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.392470 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.495701 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.496084 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.496230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.496435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.496580 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.600248 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.600311 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.600319 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.600336 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.600345 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.703143 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.703193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.703204 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.703222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.703234 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.806230 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.806298 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.806320 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.806352 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.806377 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.909928 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.910009 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.910033 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.910058 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:29 crc kubenswrapper[4771]: I1001 14:57:29.910076 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:29Z","lastTransitionTime":"2025-10-01T14:57:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.013501 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.013609 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.013637 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.013680 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.013703 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.117329 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.117391 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.117414 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.117445 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.117468 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.219963 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.220110 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.220133 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.220161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.220185 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.324066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.324513 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.324678 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.324913 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.325087 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.428641 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.428714 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.428786 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.428816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.428843 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.531372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.531436 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.531457 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.531487 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.531510 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.634985 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.635079 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.635099 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.635124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.635148 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.738420 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.738474 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.738485 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.738502 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.738512 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.842145 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.842193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.842212 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.842236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.842254 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.945470 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.945554 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.945590 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.945631 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.945653 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:30Z","lastTransitionTime":"2025-10-01T14:57:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.984346 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.984382 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.984400 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 01 14:57:30 crc kubenswrapper[4771]: I1001 14:57:30.984447 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc"
Oct 01 14:57:30 crc kubenswrapper[4771]: E1001 14:57:30.984536 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 01 14:57:30 crc kubenswrapper[4771]: E1001 14:57:30.984684 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 01 14:57:30 crc kubenswrapper[4771]: E1001 14:57:30.984854 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 01 14:57:30 crc kubenswrapper[4771]: E1001 14:57:30.985050 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.048400 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.048467 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.048483 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.048508 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.048534 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.151756 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.151795 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.151805 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.151820 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.151829 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.255022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.255363 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.255573 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.255863 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.256059 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.359372 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.359520 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.359536 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.359555 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.359570 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.462472 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.462537 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.462554 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.462581 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.462598 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.565098 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.565152 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.565162 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.565178 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.565190 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.668010 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.668450 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.668657 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.668954 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.669186 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.771570 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.771659 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.771677 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.771702 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.771723 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.874470 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.874531 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.874550 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.874572 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.874593 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.978067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.978192 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.978206 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.978263 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:31 crc kubenswrapper[4771]: I1001 14:57:31.978276 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:31Z","lastTransitionTime":"2025-10-01T14:57:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.081697 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.081958 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.082116 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.082190 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.082256 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.184983 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.185015 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.185023 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.185037 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.185047 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.288161 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.288193 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.288202 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.288215 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.288223 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.391368 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.391435 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.391452 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.391478 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.391496 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.495134 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.495201 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.495222 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.495250 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.495273 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.597147 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.597207 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.597224 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.597248 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.597267 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.700016 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.700067 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.700083 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.700111 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.700130 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.803156 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.803236 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.803260 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.803289 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.803318 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.906238 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.906283 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.906296 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.906313 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.906327 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:32Z","lastTransitionTime":"2025-10-01T14:57:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.984683 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.984725 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.984810 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:32 crc kubenswrapper[4771]: I1001 14:57:32.984903 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:32 crc kubenswrapper[4771]: E1001 14:57:32.985635 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:32 crc kubenswrapper[4771]: E1001 14:57:32.985126 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:32 crc kubenswrapper[4771]: E1001 14:57:32.985283 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:32 crc kubenswrapper[4771]: E1001 14:57:32.984947 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.009974 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.010036 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.010098 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.010124 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.010147 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.114721 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.115138 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.115362 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.115521 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.115677 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.219022 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.219069 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.219080 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.219105 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.219119 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.324557 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.324959 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.325189 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.325382 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.325915 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.428542 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.428615 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.428633 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.428656 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.428672 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.531960 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.532027 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.532044 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.532068 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.532086 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.634867 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.634931 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.634952 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.634984 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.635007 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.738184 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.738561 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.738724 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.738916 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.739072 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.841943 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.842006 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.842033 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.842066 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.842087 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.945560 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.945619 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.945641 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.945668 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:33 crc kubenswrapper[4771]: I1001 14:57:33.945690 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:33Z","lastTransitionTime":"2025-10-01T14:57:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.049808 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.049888 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.049911 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.049940 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.049966 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:34Z","lastTransitionTime":"2025-10-01T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.153479 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.153816 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.153971 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.154133 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.154265 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:34Z","lastTransitionTime":"2025-10-01T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.257714 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.257818 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.257843 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.257876 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.257901 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:34Z","lastTransitionTime":"2025-10-01T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.360413 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.360486 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.360510 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.360539 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.360561 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:34Z","lastTransitionTime":"2025-10-01T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.464920 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.465297 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.465490 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.465649 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.465836 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:34Z","lastTransitionTime":"2025-10-01T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.569273 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.569349 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.569367 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.569392 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.569410 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:34Z","lastTransitionTime":"2025-10-01T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.645981 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.646043 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.646055 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.646085 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.646105 4771 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-01T14:57:34Z","lastTransitionTime":"2025-10-01T14:57:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.712892 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd"] Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.713419 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.716434 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.716558 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.717553 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.717657 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.727897 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.727955 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.728005 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.728085 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.728124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.829565 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.829694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.829770 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.829823 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.829843 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.829868 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.829806 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.831778 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.840698 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.856499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b76e32d-43b9-437c-82e2-0e1c1860f6aa-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xr7gd\" (UID: \"0b76e32d-43b9-437c-82e2-0e1c1860f6aa\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.984709 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.984768 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.984813 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:34 crc kubenswrapper[4771]: I1001 14:57:34.984876 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:34 crc kubenswrapper[4771]: E1001 14:57:34.985404 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:34 crc kubenswrapper[4771]: E1001 14:57:34.985417 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:34 crc kubenswrapper[4771]: E1001 14:57:34.985822 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:34 crc kubenswrapper[4771]: E1001 14:57:34.985665 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:35 crc kubenswrapper[4771]: I1001 14:57:35.036242 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" Oct 01 14:57:35 crc kubenswrapper[4771]: I1001 14:57:35.603996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" event={"ID":"0b76e32d-43b9-437c-82e2-0e1c1860f6aa","Type":"ContainerStarted","Data":"c0b1ae54a44626ce963da8e5670abcb606a9790eb55cfe9cf39caea6eb577f37"} Oct 01 14:57:35 crc kubenswrapper[4771]: I1001 14:57:35.604055 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" event={"ID":"0b76e32d-43b9-437c-82e2-0e1c1860f6aa","Type":"ContainerStarted","Data":"c1781e457ce46c34573aa4c15b061ba069f4b2ac34397f2011bc2ea50f1fa251"} Oct 01 14:57:35 crc kubenswrapper[4771]: I1001 14:57:35.619665 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xr7gd" podStartSLOduration=73.619635694 podStartE2EDuration="1m13.619635694s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:57:35.619369098 +0000 UTC m=+100.238544299" 
watchObservedRunningTime="2025-10-01 14:57:35.619635694 +0000 UTC m=+100.238810865" Oct 01 14:57:36 crc kubenswrapper[4771]: I1001 14:57:36.985265 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:36 crc kubenswrapper[4771]: I1001 14:57:36.985328 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:36 crc kubenswrapper[4771]: I1001 14:57:36.985355 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:36 crc kubenswrapper[4771]: I1001 14:57:36.985375 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:36 crc kubenswrapper[4771]: E1001 14:57:36.985438 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:36 crc kubenswrapper[4771]: E1001 14:57:36.985541 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:36 crc kubenswrapper[4771]: E1001 14:57:36.985698 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:36 crc kubenswrapper[4771]: E1001 14:57:36.985838 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:38 crc kubenswrapper[4771]: I1001 14:57:38.984970 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:38 crc kubenswrapper[4771]: E1001 14:57:38.985541 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:38 crc kubenswrapper[4771]: I1001 14:57:38.985000 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:38 crc kubenswrapper[4771]: I1001 14:57:38.984965 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:38 crc kubenswrapper[4771]: E1001 14:57:38.985627 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:38 crc kubenswrapper[4771]: I1001 14:57:38.985054 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:38 crc kubenswrapper[4771]: E1001 14:57:38.985695 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:38 crc kubenswrapper[4771]: E1001 14:57:38.985801 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:40 crc kubenswrapper[4771]: I1001 14:57:40.984323 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:40 crc kubenswrapper[4771]: I1001 14:57:40.984437 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:40 crc kubenswrapper[4771]: I1001 14:57:40.984327 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:40 crc kubenswrapper[4771]: E1001 14:57:40.984531 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:40 crc kubenswrapper[4771]: I1001 14:57:40.984356 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:40 crc kubenswrapper[4771]: E1001 14:57:40.984620 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:40 crc kubenswrapper[4771]: E1001 14:57:40.984678 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:40 crc kubenswrapper[4771]: E1001 14:57:40.984802 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:41 crc kubenswrapper[4771]: I1001 14:57:41.209619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:41 crc kubenswrapper[4771]: E1001 14:57:41.209866 4771 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:57:41 crc kubenswrapper[4771]: E1001 14:57:41.210399 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs podName:a49c960d-cfd1-4745-976b-59c62e3dcf8e nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:45.210364091 +0000 UTC m=+169.829539302 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs") pod "network-metrics-daemon-8qdkc" (UID: "a49c960d-cfd1-4745-976b-59c62e3dcf8e") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 01 14:57:41 crc kubenswrapper[4771]: I1001 14:57:41.986667 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 14:57:41 crc kubenswrapper[4771]: E1001 14:57:41.987067 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:57:42 crc kubenswrapper[4771]: I1001 14:57:42.985244 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:42 crc kubenswrapper[4771]: I1001 14:57:42.985381 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:42 crc kubenswrapper[4771]: I1001 14:57:42.985491 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:42 crc kubenswrapper[4771]: E1001 14:57:42.985494 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:42 crc kubenswrapper[4771]: I1001 14:57:42.985547 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:42 crc kubenswrapper[4771]: E1001 14:57:42.985657 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:42 crc kubenswrapper[4771]: E1001 14:57:42.985920 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:42 crc kubenswrapper[4771]: E1001 14:57:42.986095 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:44 crc kubenswrapper[4771]: I1001 14:57:44.985068 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:44 crc kubenswrapper[4771]: I1001 14:57:44.985133 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:44 crc kubenswrapper[4771]: I1001 14:57:44.985133 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:44 crc kubenswrapper[4771]: I1001 14:57:44.985086 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:44 crc kubenswrapper[4771]: E1001 14:57:44.985239 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:44 crc kubenswrapper[4771]: E1001 14:57:44.985500 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:44 crc kubenswrapper[4771]: E1001 14:57:44.985826 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:44 crc kubenswrapper[4771]: E1001 14:57:44.986013 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:46 crc kubenswrapper[4771]: I1001 14:57:46.984791 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:46 crc kubenswrapper[4771]: E1001 14:57:46.985440 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:46 crc kubenswrapper[4771]: I1001 14:57:46.984809 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:46 crc kubenswrapper[4771]: E1001 14:57:46.985705 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:46 crc kubenswrapper[4771]: I1001 14:57:46.984813 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:46 crc kubenswrapper[4771]: E1001 14:57:46.986048 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:46 crc kubenswrapper[4771]: I1001 14:57:46.984860 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:46 crc kubenswrapper[4771]: E1001 14:57:46.986280 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:48 crc kubenswrapper[4771]: I1001 14:57:48.984963 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:48 crc kubenswrapper[4771]: I1001 14:57:48.985035 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:48 crc kubenswrapper[4771]: I1001 14:57:48.985080 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:48 crc kubenswrapper[4771]: I1001 14:57:48.985036 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:48 crc kubenswrapper[4771]: E1001 14:57:48.985220 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:48 crc kubenswrapper[4771]: E1001 14:57:48.985432 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:48 crc kubenswrapper[4771]: E1001 14:57:48.985465 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:48 crc kubenswrapper[4771]: E1001 14:57:48.985521 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:50 crc kubenswrapper[4771]: I1001 14:57:50.984221 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:50 crc kubenswrapper[4771]: I1001 14:57:50.984220 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:50 crc kubenswrapper[4771]: E1001 14:57:50.984418 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:50 crc kubenswrapper[4771]: I1001 14:57:50.984253 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:50 crc kubenswrapper[4771]: E1001 14:57:50.984544 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:50 crc kubenswrapper[4771]: I1001 14:57:50.984253 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:50 crc kubenswrapper[4771]: E1001 14:57:50.984707 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:50 crc kubenswrapper[4771]: E1001 14:57:50.984915 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:52 crc kubenswrapper[4771]: I1001 14:57:52.985116 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:52 crc kubenswrapper[4771]: I1001 14:57:52.985183 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:52 crc kubenswrapper[4771]: E1001 14:57:52.985267 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:52 crc kubenswrapper[4771]: I1001 14:57:52.985138 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:52 crc kubenswrapper[4771]: I1001 14:57:52.985428 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:52 crc kubenswrapper[4771]: E1001 14:57:52.985951 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:52 crc kubenswrapper[4771]: E1001 14:57:52.986059 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:52 crc kubenswrapper[4771]: E1001 14:57:52.986313 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:54 crc kubenswrapper[4771]: I1001 14:57:54.984510 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:54 crc kubenswrapper[4771]: I1001 14:57:54.984533 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:54 crc kubenswrapper[4771]: E1001 14:57:54.984672 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:54 crc kubenswrapper[4771]: I1001 14:57:54.984718 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:54 crc kubenswrapper[4771]: I1001 14:57:54.984830 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:54 crc kubenswrapper[4771]: E1001 14:57:54.985046 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:54 crc kubenswrapper[4771]: E1001 14:57:54.985102 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:54 crc kubenswrapper[4771]: E1001 14:57:54.985191 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:55 crc kubenswrapper[4771]: E1001 14:57:55.942121 4771 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 01 14:57:55 crc kubenswrapper[4771]: I1001 14:57:55.987311 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 14:57:55 crc kubenswrapper[4771]: E1001 14:57:55.987546 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-j7ntp_openshift-ovn-kubernetes(a061b8e2-74a8-4953-bfa2-5090a2f70459)\"" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" Oct 01 14:57:56 crc kubenswrapper[4771]: E1001 14:57:56.146545 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 14:57:56 crc kubenswrapper[4771]: I1001 14:57:56.985119 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:56 crc kubenswrapper[4771]: I1001 14:57:56.985240 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:56 crc kubenswrapper[4771]: E1001 14:57:56.985358 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:56 crc kubenswrapper[4771]: E1001 14:57:56.985441 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:56 crc kubenswrapper[4771]: I1001 14:57:56.985887 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:56 crc kubenswrapper[4771]: E1001 14:57:56.985959 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:56 crc kubenswrapper[4771]: I1001 14:57:56.985990 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:56 crc kubenswrapper[4771]: E1001 14:57:56.986100 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:58 crc kubenswrapper[4771]: I1001 14:57:58.985151 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:57:58 crc kubenswrapper[4771]: I1001 14:57:58.985220 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:57:58 crc kubenswrapper[4771]: I1001 14:57:58.985213 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:57:58 crc kubenswrapper[4771]: E1001 14:57:58.985330 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:57:58 crc kubenswrapper[4771]: E1001 14:57:58.985470 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:57:58 crc kubenswrapper[4771]: E1001 14:57:58.985690 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:57:58 crc kubenswrapper[4771]: I1001 14:57:58.985855 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:57:58 crc kubenswrapper[4771]: E1001 14:57:58.986008 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:57:59 crc kubenswrapper[4771]: I1001 14:57:59.698935 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/1.log" Oct 01 14:57:59 crc kubenswrapper[4771]: I1001 14:57:59.699900 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/0.log" Oct 01 14:57:59 crc kubenswrapper[4771]: I1001 14:57:59.699988 4771 generic.go:334] "Generic (PLEG): container finished" podID="c96a3328-c79b-4528-b9b5-badbc7380dd6" containerID="4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b" exitCode=1 Oct 01 14:57:59 crc kubenswrapper[4771]: I1001 14:57:59.700061 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lvcz" event={"ID":"c96a3328-c79b-4528-b9b5-badbc7380dd6","Type":"ContainerDied","Data":"4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b"} Oct 01 14:57:59 crc kubenswrapper[4771]: I1001 14:57:59.700175 4771 scope.go:117] "RemoveContainer" 
containerID="1dea414c74a97184e0507113cc8069ee732fe85125aee58c080165c511835f1d" Oct 01 14:57:59 crc kubenswrapper[4771]: I1001 14:57:59.701052 4771 scope.go:117] "RemoveContainer" containerID="4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b" Oct 01 14:57:59 crc kubenswrapper[4771]: E1001 14:57:59.701458 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-9lvcz_openshift-multus(c96a3328-c79b-4528-b9b5-badbc7380dd6)\"" pod="openshift-multus/multus-9lvcz" podUID="c96a3328-c79b-4528-b9b5-badbc7380dd6" Oct 01 14:58:00 crc kubenswrapper[4771]: I1001 14:58:00.706399 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/1.log" Oct 01 14:58:00 crc kubenswrapper[4771]: I1001 14:58:00.985017 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:00 crc kubenswrapper[4771]: I1001 14:58:00.985082 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:00 crc kubenswrapper[4771]: I1001 14:58:00.985025 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:00 crc kubenswrapper[4771]: I1001 14:58:00.985194 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:00 crc kubenswrapper[4771]: E1001 14:58:00.985974 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:58:00 crc kubenswrapper[4771]: E1001 14:58:00.985338 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:58:00 crc kubenswrapper[4771]: E1001 14:58:00.985401 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:58:00 crc kubenswrapper[4771]: E1001 14:58:00.985246 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:58:01 crc kubenswrapper[4771]: E1001 14:58:01.148015 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 14:58:02 crc kubenswrapper[4771]: I1001 14:58:02.985009 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:02 crc kubenswrapper[4771]: I1001 14:58:02.985201 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:02 crc kubenswrapper[4771]: E1001 14:58:02.986411 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:58:02 crc kubenswrapper[4771]: I1001 14:58:02.985327 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:02 crc kubenswrapper[4771]: E1001 14:58:02.986540 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:58:02 crc kubenswrapper[4771]: I1001 14:58:02.985287 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:02 crc kubenswrapper[4771]: E1001 14:58:02.986625 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:58:02 crc kubenswrapper[4771]: E1001 14:58:02.986428 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:58:04 crc kubenswrapper[4771]: I1001 14:58:04.984701 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:04 crc kubenswrapper[4771]: I1001 14:58:04.984805 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:04 crc kubenswrapper[4771]: I1001 14:58:04.984813 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:04 crc kubenswrapper[4771]: I1001 14:58:04.984880 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:04 crc kubenswrapper[4771]: E1001 14:58:04.984959 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:58:04 crc kubenswrapper[4771]: E1001 14:58:04.985031 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:58:04 crc kubenswrapper[4771]: E1001 14:58:04.985112 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:58:04 crc kubenswrapper[4771]: E1001 14:58:04.985210 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:58:06 crc kubenswrapper[4771]: E1001 14:58:06.149273 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 01 14:58:06 crc kubenswrapper[4771]: I1001 14:58:06.984951 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:06 crc kubenswrapper[4771]: I1001 14:58:06.985001 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:06 crc kubenswrapper[4771]: I1001 14:58:06.985085 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:06 crc kubenswrapper[4771]: E1001 14:58:06.985098 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:58:06 crc kubenswrapper[4771]: I1001 14:58:06.984964 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:06 crc kubenswrapper[4771]: E1001 14:58:06.985356 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:58:06 crc kubenswrapper[4771]: E1001 14:58:06.985549 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:58:06 crc kubenswrapper[4771]: E1001 14:58:06.985629 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:58:08 crc kubenswrapper[4771]: I1001 14:58:08.984380 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:08 crc kubenswrapper[4771]: I1001 14:58:08.984438 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:08 crc kubenswrapper[4771]: E1001 14:58:08.984564 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:58:08 crc kubenswrapper[4771]: I1001 14:58:08.984380 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:08 crc kubenswrapper[4771]: I1001 14:58:08.984667 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:08 crc kubenswrapper[4771]: E1001 14:58:08.984980 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:58:08 crc kubenswrapper[4771]: E1001 14:58:08.985140 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:58:08 crc kubenswrapper[4771]: E1001 14:58:08.985585 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:58:09 crc kubenswrapper[4771]: I1001 14:58:09.986385 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.745944 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/3.log" Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.749053 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerStarted","Data":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.749534 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.777150 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podStartSLOduration=108.777131031 podStartE2EDuration="1m48.777131031s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
14:58:10.775175672 +0000 UTC m=+135.394350843" watchObservedRunningTime="2025-10-01 14:58:10.777131031 +0000 UTC m=+135.396306202" Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.964863 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8qdkc"] Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.964961 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:10 crc kubenswrapper[4771]: E1001 14:58:10.965046 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.984982 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:10 crc kubenswrapper[4771]: E1001 14:58:10.985208 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.985560 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:10 crc kubenswrapper[4771]: E1001 14:58:10.985785 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:58:10 crc kubenswrapper[4771]: I1001 14:58:10.986086 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:10 crc kubenswrapper[4771]: E1001 14:58:10.986223 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:58:11 crc kubenswrapper[4771]: E1001 14:58:11.151449 4771 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 01 14:58:11 crc kubenswrapper[4771]: I1001 14:58:11.985291 4771 scope.go:117] "RemoveContainer" containerID="4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b" Oct 01 14:58:12 crc kubenswrapper[4771]: I1001 14:58:12.762812 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/1.log" Oct 01 14:58:12 crc kubenswrapper[4771]: I1001 14:58:12.762880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lvcz" event={"ID":"c96a3328-c79b-4528-b9b5-badbc7380dd6","Type":"ContainerStarted","Data":"2d79a918c545b25fd3949c7e72a6a5446d3d66da50fda7363410f26ddf35b04e"} Oct 01 14:58:12 crc kubenswrapper[4771]: I1001 14:58:12.984777 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:12 crc kubenswrapper[4771]: I1001 14:58:12.984805 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:12 crc kubenswrapper[4771]: I1001 14:58:12.984828 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:12 crc kubenswrapper[4771]: I1001 14:58:12.984917 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:12 crc kubenswrapper[4771]: E1001 14:58:12.984956 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:58:12 crc kubenswrapper[4771]: E1001 14:58:12.985134 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:58:12 crc kubenswrapper[4771]: E1001 14:58:12.985331 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:58:12 crc kubenswrapper[4771]: E1001 14:58:12.985390 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:58:14 crc kubenswrapper[4771]: I1001 14:58:14.984846 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:14 crc kubenswrapper[4771]: I1001 14:58:14.984901 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:14 crc kubenswrapper[4771]: I1001 14:58:14.984932 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:14 crc kubenswrapper[4771]: I1001 14:58:14.985024 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:14 crc kubenswrapper[4771]: E1001 14:58:14.985017 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8qdkc" podUID="a49c960d-cfd1-4745-976b-59c62e3dcf8e" Oct 01 14:58:14 crc kubenswrapper[4771]: E1001 14:58:14.985291 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 01 14:58:14 crc kubenswrapper[4771]: E1001 14:58:14.985442 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 01 14:58:14 crc kubenswrapper[4771]: E1001 14:58:14.985571 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.984442 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.984532 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.984589 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.984615 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.987960 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.988188 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.989954 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.990003 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.990212 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 01 14:58:16 crc kubenswrapper[4771]: I1001 14:58:16.990931 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 01 14:58:24 crc kubenswrapper[4771]: I1001 14:58:24.921843 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:24 crc kubenswrapper[4771]: E1001 14:58:24.922041 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 15:00:26.922013211 +0000 UTC m=+271.541188382 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.024041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.024301 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.024395 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.024485 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.025414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.029953 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.030646 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.031792 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.109435 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.141629 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.157279 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 01 14:58:25 crc kubenswrapper[4771]: W1001 14:58:25.407127 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-fe3df796c7f0df6276a8b8a365e14159e39ea34ba0e6b9d9c98225ee9898d4ef WatchSource:0}: Error finding container fe3df796c7f0df6276a8b8a365e14159e39ea34ba0e6b9d9c98225ee9898d4ef: Status 404 returned error can't find the container with id fe3df796c7f0df6276a8b8a365e14159e39ea34ba0e6b9d9c98225ee9898d4ef Oct 01 14:58:25 crc kubenswrapper[4771]: W1001 14:58:25.622131 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-bd48907958a9925028bbcf227b9ff6ff5722f069802b4383a3f86eb0f32fbdc8 WatchSource:0}: Error finding container bd48907958a9925028bbcf227b9ff6ff5722f069802b4383a3f86eb0f32fbdc8: Status 404 returned error can't find the container with id bd48907958a9925028bbcf227b9ff6ff5722f069802b4383a3f86eb0f32fbdc8 Oct 01 14:58:25 crc kubenswrapper[4771]: W1001 14:58:25.624313 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e1aae3f3c0b2e10c4b968c7208c9e40660f106aa4afaceca55f88b5a0a362fe9 WatchSource:0}: Error finding container e1aae3f3c0b2e10c4b968c7208c9e40660f106aa4afaceca55f88b5a0a362fe9: Status 404 returned error can't find the container with id e1aae3f3c0b2e10c4b968c7208c9e40660f106aa4afaceca55f88b5a0a362fe9 Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.769135 4771 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.802498 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-shl79"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.803117 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.813668 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bf1d5ddcd70e740506f2ed102d205a9f35efc4e70e83d243a5fdde7b2f5a220b"} Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.813751 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"bd48907958a9925028bbcf227b9ff6ff5722f069802b4383a3f86eb0f32fbdc8"} Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.825033 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"385417ecaa6962386d3f651ba91fe7b2b939a9f219a6f716295aea64f4ce0bf2"} Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 
14:58:25.825083 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e1aae3f3c0b2e10c4b968c7208c9e40660f106aa4afaceca55f88b5a0a362fe9"} Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.825336 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.827227 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a3763af33e3a88c0b9a3cbdcc754b2ab64cf176fae6f333c5add8f1fe3576db4"} Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.827283 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fe3df796c7f0df6276a8b8a365e14159e39ea34ba0e6b9d9c98225ee9898d4ef"} Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.830228 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.830367 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.830761 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.832123 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qlkrl"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.832614 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.835451 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.835646 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.856011 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.856254 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.856292 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.856419 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.857449 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.857482 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.857504 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.857650 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.863787 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.864319 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-75jhw"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.864618 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.864747 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.865094 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kwgcl"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.865409 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.865764 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.866229 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.881048 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.885379 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.885665 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhdpb"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.886277 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.886592 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.887103 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.888277 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qlkrl"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.889676 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t6brp"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.889997 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-shl79"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.890071 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.890077 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.894073 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.896578 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.896604 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.896666 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.896809 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.896962 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.905310 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.905980 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.906012 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.906080 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.906112 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.906017 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.906381 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.907024 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.907079 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.907263 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.907337 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.922697 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.922810 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 01 14:58:25 crc 
kubenswrapper[4771]: I1001 14:58:25.922945 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.923195 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.924504 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.924829 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.927672 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mjvfw"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.929436 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.929690 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.929836 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.930031 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.930082 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.930131 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.930258 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.931168 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nrlb2"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.931636 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.931657 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.931981 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.932348 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.932421 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.932532 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.932583 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.932621 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.933040 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.933650 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.933884 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.934504 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.936668 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rss87\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-kube-api-access-rss87\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.936706 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-encryption-config\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 
14:58:25.936756 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-etcd-serving-ca\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.936783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13778ea7-497e-431d-a3a9-96979d2e4885-node-pullsecrets\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.936821 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kwgcl"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.936834 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-trusted-ca\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.936846 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.936874 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937094 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 
14:58:25.937138 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937266 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937295 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937361 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937400 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937452 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937606 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937660 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13778ea7-497e-431d-a3a9-96979d2e4885-audit-dir\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937677 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-bound-sa-token\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: 
\"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937700 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-serving-cert\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-trusted-ca-bundle\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937757 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-trusted-ca\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937773 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937786 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b800e30-2559-4c0b-9732-7a069ae3da91-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937909 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-certificates\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-audit\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.937992 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-serving-cert\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938041 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fcdm\" 
(UniqueName: \"kubernetes.io/projected/64c93e25-6cb1-443c-bbe4-8fb155713ddb-kube-api-access-7fcdm\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938065 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f160df7c-e97b-4c5a-badf-08379f8e27bf-config\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938111 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-service-ca\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938632 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vjwt\" (UniqueName: \"kubernetes.io/projected/76fbf853-2ec4-4f50-a0d1-c633314219b3-kube-api-access-6vjwt\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938658 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-tls\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r4rb\" (UniqueName: \"kubernetes.io/projected/f160df7c-e97b-4c5a-badf-08379f8e27bf-kube-api-access-7r4rb\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-client\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938764 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f160df7c-e97b-4c5a-badf-08379f8e27bf-images\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938804 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-etcd-client\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938841 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-serving-cert\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzn8g\" (UniqueName: \"kubernetes.io/projected/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-kube-api-access-tzn8g\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938969 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64c93e25-6cb1-443c-bbe4-8fb155713ddb-auth-proxy-config\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.938998 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wz6\" (UniqueName: \"kubernetes.io/projected/13778ea7-497e-431d-a3a9-96979d2e4885-kube-api-access-v7wz6\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939018 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64c93e25-6cb1-443c-bbe4-8fb155713ddb-config\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 
14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939041 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b800e30-2559-4c0b-9732-7a069ae3da91-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939066 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-config\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939084 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/64c93e25-6cb1-443c-bbe4-8fb155713ddb-machine-approver-tls\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/757b923d-5bf5-4b31-af60-a617c6b13559-metrics-tls\") pod \"dns-operator-744455d44c-kwgcl\" (UID: \"757b923d-5bf5-4b31-af60-a617c6b13559\") " pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76fbf853-2ec4-4f50-a0d1-c633314219b3-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939158 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939192 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-image-import-ca\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939212 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtklh\" (UniqueName: \"kubernetes.io/projected/757b923d-5bf5-4b31-af60-a617c6b13559-kube-api-access-gtklh\") pod \"dns-operator-744455d44c-kwgcl\" (UID: \"757b923d-5bf5-4b31-af60-a617c6b13559\") " pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939231 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f160df7c-e97b-4c5a-badf-08379f8e27bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:25 crc 
kubenswrapper[4771]: I1001 14:58:25.939256 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-config\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76fbf853-2ec4-4f50-a0d1-c633314219b3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939300 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-config\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-ca\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.939344 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kssl5\" (UniqueName: \"kubernetes.io/projected/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-kube-api-access-kssl5\") pod 
\"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:25 crc kubenswrapper[4771]: E1001 14:58:25.939536 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.439525079 +0000 UTC m=+151.058700250 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.943910 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.944364 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.944909 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-szwtc"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.945244 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.945610 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.945931 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.945993 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.946499 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.946757 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.946859 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.947108 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.947177 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.948590 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z6x72"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.949141 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.951274 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.952259 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.952294 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.952772 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.952908 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953305 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953441 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953468 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953564 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953575 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 01 14:58:25 crc 
kubenswrapper[4771]: I1001 14:58:25.953647 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953708 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953809 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953718 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953919 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953982 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.954057 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.954129 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.953821 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.954200 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.954304 4771 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.954379 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.963069 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.954397 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.963426 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.963776 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pr972"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.963971 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.964149 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.964266 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.964305 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.964695 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.965133 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.965192 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.965437 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.965517 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.966304 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.966534 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.978970 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.981828 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.982315 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.982659 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 
14:58:25.983211 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.983431 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.983673 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.983842 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.983722 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rsd2x"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.984114 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.984534 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.984626 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.984706 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.984892 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.984911 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.985075 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.986415 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.996611 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.996910 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.997441 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.997900 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh"] Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.999700 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:25 crc kubenswrapper[4771]: I1001 14:58:25.999921 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.000127 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.004180 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.004678 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.005277 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.005521 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.005644 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.006451 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.007759 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.008787 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wpxh2"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.009158 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.009359 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.009985 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ds5h4"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.010721 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.013881 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.016438 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.018245 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.026536 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.026665 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-mhd4x"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.033899 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ddff"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.034361 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.034406 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mjvfw"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.034501 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.034679 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040070 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-serving-cert\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040397 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-certificates\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040432 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b31fcf4c-ab24-4f6a-9807-04c076e2d548-proxy-tls\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040455 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-audit\") pod \"apiserver-76f77b778f-shl79\" (UID: 
\"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040491 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khxgk\" (UniqueName: \"kubernetes.io/projected/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-kube-api-access-khxgk\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040514 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040550 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-serving-cert\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040574 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fcdm\" (UniqueName: \"kubernetes.io/projected/64c93e25-6cb1-443c-bbe4-8fb155713ddb-kube-api-access-7fcdm\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040597 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-srv-cert\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040615 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vjwt\" (UniqueName: \"kubernetes.io/projected/76fbf853-2ec4-4f50-a0d1-c633314219b3-kube-api-access-6vjwt\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-tls\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040654 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-client\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg4hf\" (UniqueName: \"kubernetes.io/projected/51a76061-b2bf-427b-985e-767ebad2a8cb-kube-api-access-jg4hf\") pod \"openshift-config-operator-7777fb866f-tk4n9\" (UID: 
\"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040696 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f160df7c-e97b-4c5a-badf-08379f8e27bf-images\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040717 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7skn\" (UniqueName: \"kubernetes.io/projected/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-kube-api-access-d7skn\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040754 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d14cf95-98ee-432d-8889-edf3508b4eb3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040773 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq8gj\" (UniqueName: \"kubernetes.io/projected/892c3644-ab53-40dc-a65e-4b14e6b537ed-kube-api-access-wq8gj\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.040791 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-serving-cert\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.041546 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-stats-auth\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.041603 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.041631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.041649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: 
\"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.041669 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnhjj\" (UniqueName: \"kubernetes.io/projected/d92fedc8-d031-40c1-b9fa-695496499a26-kube-api-access-wnhjj\") pod \"downloads-7954f5f757-nrlb2\" (UID: \"d92fedc8-d031-40c1-b9fa-695496499a26\") " pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.041688 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzn8g\" (UniqueName: \"kubernetes.io/projected/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-kube-api-access-tzn8g\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.041705 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-config\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.042049 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.542024015 +0000 UTC m=+151.161199186 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.042925 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/f160df7c-e97b-4c5a-badf-08379f8e27bf-images\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.043031 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-audit\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.044116 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-certificates\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.045031 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-serving-cert\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " 
pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.045137 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-tls\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.045888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bba146f-994a-4fbd-834f-861c2ffa4232-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.045935 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-oauth-serving-cert\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.045957 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvznm\" (UniqueName: \"kubernetes.io/projected/0d14cf95-98ee-432d-8889-edf3508b4eb3-kube-api-access-cvznm\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.045978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.045999 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/728535b7-cedd-4391-b5e4-8cee0982380d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046020 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/335db6c7-efb8-4055-aacf-8262b4ec5b91-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046039 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-policies\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046066 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b800e30-2559-4c0b-9732-7a069ae3da91-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046088 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d14cf95-98ee-432d-8889-edf3508b4eb3-metrics-tls\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046114 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/64c93e25-6cb1-443c-bbe4-8fb155713ddb-machine-approver-tls\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046712 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvxhk\" (UniqueName: \"kubernetes.io/projected/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-kube-api-access-vvxhk\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046747 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bba146f-994a-4fbd-834f-861c2ffa4232-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046770 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/757b923d-5bf5-4b31-af60-a617c6b13559-metrics-tls\") pod \"dns-operator-744455d44c-kwgcl\" (UID: \"757b923d-5bf5-4b31-af60-a617c6b13559\") " pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76fbf853-2ec4-4f50-a0d1-c633314219b3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046809 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6e05857-3b53-4e95-9f02-a79ad8b509a8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pc29\" (UID: \"f6e05857-3b53-4e95-9f02-a79ad8b509a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046828 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-proxy-tls\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046848 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-apiservice-cert\") pod 
\"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdcmg\" (UniqueName: \"kubernetes.io/projected/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-kube-api-access-jdcmg\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f160df7c-e97b-4c5a-badf-08379f8e27bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046920 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc 
kubenswrapper[4771]: I1001 14:58:26.046941 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-config\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046960 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046975 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.046996 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-config\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047020 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-config\") pod 
\"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047041 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kssl5\" (UniqueName: \"kubernetes.io/projected/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-kube-api-access-kssl5\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047058 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/558574b3-154b-48bc-9436-3013a9b62f28-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047072 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2238a30-4ae1-4bd8-acfa-1e357552252c-service-ca-bundle\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047088 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z8c2\" (UniqueName: \"kubernetes.io/projected/3f9e5d02-c03d-42b0-a837-bfa317d1cbd8-kube-api-access-2z8c2\") pod \"cluster-samples-operator-665b6dd947-ld5bx\" (UID: \"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 
14:58:26.047108 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxzp5\" (UniqueName: \"kubernetes.io/projected/40496e6d-3f79-4478-804b-dc9904473801-kube-api-access-pxzp5\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047125 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cd5336-780a-49f1-9360-e7b8c97779d2-config\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047141 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwcdw\" (UniqueName: \"kubernetes.io/projected/cd8df04a-9a5e-4784-ac97-9782d936fa5e-kube-api-access-dwcdw\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047183 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rss87\" (UniqueName: 
\"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-kube-api-access-rss87\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047201 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-key\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047218 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhpgp\" (UniqueName: \"kubernetes.io/projected/6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4-kube-api-access-nhpgp\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gml8\" (UID: \"6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047239 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-encryption-config\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047254 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-config\") pod \"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 
14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047270 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047292 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-etcd-serving-ca\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047310 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/728535b7-cedd-4391-b5e4-8cee0982380d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f9e5d02-c03d-42b0-a837-bfa317d1cbd8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ld5bx\" (UID: \"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047371 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/13778ea7-497e-431d-a3a9-96979d2e4885-node-pullsecrets\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047391 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1cd5336-780a-49f1-9360-e7b8c97779d2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047408 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1cd5336-780a-49f1-9360-e7b8c97779d2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047425 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/541a902f-ea82-44e7-9c01-e93c9e01a2b6-audit-dir\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047444 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-bound-sa-token\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 
14:58:26.047461 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-client-ca\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047479 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xthf\" (UniqueName: \"kubernetes.io/projected/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-kube-api-access-7xthf\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047497 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b31fcf4c-ab24-4f6a-9807-04c076e2d548-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047514 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/51a76061-b2bf-427b-985e-767ebad2a8cb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tk4n9\" (UID: \"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047532 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gml8\" (UID: \"6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047550 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558574b3-154b-48bc-9436-3013a9b62f28-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047565 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n5gh\" (UniqueName: \"kubernetes.io/projected/335db6c7-efb8-4055-aacf-8262b4ec5b91-kube-api-access-2n5gh\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047585 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-trusted-ca-bundle\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047602 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-trusted-ca\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " 
pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047620 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b800e30-2559-4c0b-9732-7a069ae3da91-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047635 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-dir\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-service-ca\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047668 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-srv-cert\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047684 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-serving-cert\") pod 
\"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047699 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047717 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-serving-cert\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047746 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/558574b3-154b-48bc-9436-3013a9b62f28-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047762 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-config\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 
14:58:26.047778 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047796 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f160df7c-e97b-4c5a-badf-08379f8e27bf-config\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047811 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047829 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-service-ca\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-serving-cert\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: 
\"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047859 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmwhv\" (UniqueName: \"kubernetes.io/projected/2bba146f-994a-4fbd-834f-861c2ffa4232-kube-api-access-wmwhv\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047879 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r4rb\" (UniqueName: \"kubernetes.io/projected/f160df7c-e97b-4c5a-badf-08379f8e27bf-kube-api-access-7r4rb\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp8lb\" (UniqueName: \"kubernetes.io/projected/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-kube-api-access-hp8lb\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047909 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047925 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-trusted-ca-bundle\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047956 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-oauth-config\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047972 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-client-ca\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.047990 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q9q6\" (UniqueName: \"kubernetes.io/projected/b248e756-e3b8-4fd9-b6fb-99ee87df696d-kube-api-access-9q9q6\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: 
\"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048013 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-etcd-client\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048032 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-serving-cert\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048051 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048070 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b31fcf4c-ab24-4f6a-9807-04c076e2d548-images\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048086 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" 
(UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048106 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51a76061-b2bf-427b-985e-767ebad2a8cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-tk4n9\" (UID: \"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048121 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-cabundle\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75svc\" (UniqueName: \"kubernetes.io/projected/a2238a30-4ae1-4bd8-acfa-1e357552252c-kube-api-access-75svc\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048153 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-etcd-client\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc 
kubenswrapper[4771]: I1001 14:58:26.048173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64c93e25-6cb1-443c-bbe4-8fb155713ddb-auth-proxy-config\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048192 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wz6\" (UniqueName: \"kubernetes.io/projected/13778ea7-497e-431d-a3a9-96979d2e4885-kube-api-access-v7wz6\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048209 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64c93e25-6cb1-443c-bbe4-8fb155713ddb-config\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048229 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-config\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048246 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-metrics-certs\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " 
pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048262 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh5xh\" (UniqueName: \"kubernetes.io/projected/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-kube-api-access-bh5xh\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048286 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048302 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-image-import-ca\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtklh\" (UniqueName: \"kubernetes.io/projected/757b923d-5bf5-4b31-af60-a617c6b13559-kube-api-access-gtklh\") pod \"dns-operator-744455d44c-kwgcl\" (UID: \"757b923d-5bf5-4b31-af60-a617c6b13559\") " pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048338 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ds5h4\" (UID: \"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048354 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048371 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-metrics-tls\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048386 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048403 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-config\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048422 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d14cf95-98ee-432d-8889-edf3508b4eb3-trusted-ca\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048455 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjt8\" (UniqueName: \"kubernetes.io/projected/541a902f-ea82-44e7-9c01-e93c9e01a2b6-kube-api-access-rqjt8\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048487 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76fbf853-2ec4-4f50-a0d1-c633314219b3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048503 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-ca\") pod 
\"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048524 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dcjp\" (UniqueName: \"kubernetes.io/projected/1d5cfd50-877b-49ab-82f0-7753f0fabba4-kube-api-access-5dcjp\") pod \"migrator-59844c95c7-n5mnh\" (UID: \"1d5cfd50-877b-49ab-82f0-7753f0fabba4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048544 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5dkt\" (UniqueName: \"kubernetes.io/projected/b31fcf4c-ab24-4f6a-9807-04c076e2d548-kube-api-access-n5dkt\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048559 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-profile-collector-cert\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048575 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-tmpfs\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048590 
4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h76zj\" (UniqueName: \"kubernetes.io/projected/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-kube-api-access-h76zj\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048609 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlcnm\" (UniqueName: \"kubernetes.io/projected/728535b7-cedd-4391-b5e4-8cee0982380d-kube-api-access-nlcnm\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048625 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-config\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048640 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h95c4\" (UniqueName: \"kubernetes.io/projected/9db1abd4-f11c-45e1-9341-6c818c3e3579-kube-api-access-h95c4\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048654 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd8df04a-9a5e-4784-ac97-9782d936fa5e-serving-cert\") pod 
\"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048671 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd5p6\" (UniqueName: \"kubernetes.io/projected/f6e05857-3b53-4e95-9f02-a79ad8b509a8-kube-api-access-nd5p6\") pod \"package-server-manager-789f6589d5-8pc29\" (UID: \"f6e05857-3b53-4e95-9f02-a79ad8b509a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048688 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-audit-policies\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048718 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-webhook-cert\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048758 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048779 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-trusted-ca\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048796 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk4cv\" (UniqueName: \"kubernetes.io/projected/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-kube-api-access-zk4cv\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048820 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13778ea7-497e-431d-a3a9-96979d2e4885-audit-dir\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048836 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-config-volume\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " 
pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048849 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-encryption-config\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048878 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-service-ca-bundle\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048897 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh8jc\" (UniqueName: \"kubernetes.io/projected/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-kube-api-access-fh8jc\") pod \"multus-admission-controller-857f4d67dd-ds5h4\" (UID: \"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048913 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-default-certificate\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048929 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/728535b7-cedd-4391-b5e4-8cee0982380d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.048945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.052441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-service-ca\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.054179 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-75jhw"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.054233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8b800e30-2559-4c0b-9732-7a069ae3da91-installation-pull-secrets\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.054995 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-etcd-serving-ca\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.055090 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13778ea7-497e-431d-a3a9-96979d2e4885-node-pullsecrets\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.055539 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-config\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.055640 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/64c93e25-6cb1-443c-bbe4-8fb155713ddb-machine-approver-tls\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.056018 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-config\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.056508 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76fbf853-2ec4-4f50-a0d1-c633314219b3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.056506 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t6brp"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.056857 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-trusted-ca-bundle\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.057655 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/757b923d-5bf5-4b31-af60-a617c6b13559-metrics-tls\") pod \"dns-operator-744455d44c-kwgcl\" (UID: \"757b923d-5bf5-4b31-af60-a617c6b13559\") " pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.057838 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-trusted-ca\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " 
pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.058184 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b800e30-2559-4c0b-9732-7a069ae3da91-ca-trust-extracted\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.061160 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.058474 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-client\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.063245 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-etcd-ca\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.063386 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13778ea7-497e-431d-a3a9-96979d2e4885-audit-dir\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.063905 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/64c93e25-6cb1-443c-bbe4-8fb155713ddb-config\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.065152 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.565124825 +0000 UTC m=+151.184299996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.065795 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f160df7c-e97b-4c5a-badf-08379f8e27bf-config\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.066412 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-encryption-config\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.066614 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-config\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.067798 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-trusted-ca\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.067855 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhdpb"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.068294 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.068464 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64c93e25-6cb1-443c-bbe4-8fb155713ddb-auth-proxy-config\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.072426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13778ea7-497e-431d-a3a9-96979d2e4885-etcd-client\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.076832 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-serving-cert\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.077479 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/13778ea7-497e-431d-a3a9-96979d2e4885-image-import-ca\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.078295 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.080472 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76fbf853-2ec4-4f50-a0d1-c633314219b3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.081332 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.083090 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-serving-cert\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.083816 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-szwtc"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.085543 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.085811 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.087505 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.089852 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.091147 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pr972"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.092229 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.094961 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.096910 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nrlb2"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.096966 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/f160df7c-e97b-4c5a-badf-08379f8e27bf-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: 
\"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.098635 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-c64p2"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.099638 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.102058 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5fnbm"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.103150 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.103367 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rsd2x"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.104511 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.104807 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.105829 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.107200 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.108778 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z6x72"] Oct 
01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.110802 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.112344 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.114663 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.116381 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.118667 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.123450 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.125204 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.126521 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.127582 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.128637 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ds5h4"] Oct 01 14:58:26 crc kubenswrapper[4771]: 
I1001 14:58:26.129607 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ddff"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.132513 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.134262 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5fnbm"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.135451 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c64p2"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.136866 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.138670 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.139859 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mhd4x"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.140947 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-dp6jb"] Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.141709 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.146600 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.150364 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.150482 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.650463513 +0000 UTC m=+151.269638684 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.150700 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-client-ca\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.151089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xthf\" (UniqueName: \"kubernetes.io/projected/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-kube-api-access-7xthf\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.151141 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b31fcf4c-ab24-4f6a-9807-04c076e2d548-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.151164 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/51a76061-b2bf-427b-985e-767ebad2a8cb-available-featuregates\") pod 
\"openshift-config-operator-7777fb866f-tk4n9\" (UID: \"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.151209 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gml8\" (UID: \"6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.151649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/51a76061-b2bf-427b-985e-767ebad2a8cb-available-featuregates\") pod \"openshift-config-operator-7777fb866f-tk4n9\" (UID: \"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.151674 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-client-ca\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.152140 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b31fcf4c-ab24-4f6a-9807-04c076e2d548-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc 
kubenswrapper[4771]: I1001 14:58:26.152989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-registration-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.153123 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558574b3-154b-48bc-9436-3013a9b62f28-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.153163 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n5gh\" (UniqueName: \"kubernetes.io/projected/335db6c7-efb8-4055-aacf-8262b4ec5b91-kube-api-access-2n5gh\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.153792 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/558574b3-154b-48bc-9436-3013a9b62f28-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154013 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-dir\") pod 
\"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154086 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-service-ca\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154132 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-srv-cert\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154161 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-serving-cert\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154215 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/558574b3-154b-48bc-9436-3013a9b62f28-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154243 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-config\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154269 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154298 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154326 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-plugins-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 
14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-serving-cert\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmwhv\" (UniqueName: \"kubernetes.io/projected/2bba146f-994a-4fbd-834f-861c2ffa4232-kube-api-access-wmwhv\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154406 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154433 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-trusted-ca-bundle\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154482 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hp8lb\" (UniqueName: \"kubernetes.io/projected/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-kube-api-access-hp8lb\") pod \"oauth-openshift-558db77b4-pr972\" (UID: 
\"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154565 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-oauth-config\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154592 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-client-ca\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q9q6\" (UniqueName: \"kubernetes.io/projected/b248e756-e3b8-4fd9-b6fb-99ee87df696d-kube-api-access-9q9q6\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154641 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-cert\") pod \"ingress-canary-c64p2\" (UID: \"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5\") " pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b31fcf4c-ab24-4f6a-9807-04c076e2d548-images\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154724 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51a76061-b2bf-427b-985e-767ebad2a8cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-tk4n9\" (UID: \"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154803 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-cabundle\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154828 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75svc\" (UniqueName: \"kubernetes.io/projected/a2238a30-4ae1-4bd8-acfa-1e357552252c-kube-api-access-75svc\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154853 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-etcd-client\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154880 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvw4z\" (UniqueName: \"kubernetes.io/projected/828f2820-7234-4e86-81fd-ca8666c1e640-kube-api-access-kvw4z\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-metrics-certs\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc 
kubenswrapper[4771]: I1001 14:58:26.154950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh5xh\" (UniqueName: \"kubernetes.io/projected/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-kube-api-access-bh5xh\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154992 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155033 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ds5h4\" (UID: \"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155060 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155090 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-metrics-tls\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155118 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155142 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-socket-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155164 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-mountpoint-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155199 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d14cf95-98ee-432d-8889-edf3508b4eb3-trusted-ca\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155270 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjt8\" (UniqueName: \"kubernetes.io/projected/541a902f-ea82-44e7-9c01-e93c9e01a2b6-kube-api-access-rqjt8\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155298 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jgdw8\" (UniqueName: \"kubernetes.io/projected/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-kube-api-access-jgdw8\") pod \"ingress-canary-c64p2\" (UID: \"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5\") " pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155329 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dcjp\" (UniqueName: \"kubernetes.io/projected/1d5cfd50-877b-49ab-82f0-7753f0fabba4-kube-api-access-5dcjp\") pod \"migrator-59844c95c7-n5mnh\" (UID: \"1d5cfd50-877b-49ab-82f0-7753f0fabba4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155360 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5dkt\" (UniqueName: \"kubernetes.io/projected/b31fcf4c-ab24-4f6a-9807-04c076e2d548-kube-api-access-n5dkt\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155386 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-profile-collector-cert\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155412 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-tmpfs\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h76zj\" (UniqueName: \"kubernetes.io/projected/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-kube-api-access-h76zj\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155551 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlcnm\" (UniqueName: \"kubernetes.io/projected/728535b7-cedd-4391-b5e4-8cee0982380d-kube-api-access-nlcnm\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155587 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-config\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155628 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h95c4\" (UniqueName: \"kubernetes.io/projected/9db1abd4-f11c-45e1-9341-6c818c3e3579-kube-api-access-h95c4\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155665 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd8df04a-9a5e-4784-ac97-9782d936fa5e-serving-cert\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155700 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd5p6\" (UniqueName: \"kubernetes.io/projected/f6e05857-3b53-4e95-9f02-a79ad8b509a8-kube-api-access-nd5p6\") pod \"package-server-manager-789f6589d5-8pc29\" (UID: \"f6e05857-3b53-4e95-9f02-a79ad8b509a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155761 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-audit-policies\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155784 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-service-ca\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155796 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155882 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-webhook-cert\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155914 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155948 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk4cv\" (UniqueName: \"kubernetes.io/projected/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-kube-api-access-zk4cv\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-encryption-config\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.155999 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.156030 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-config-volume\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.156053 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-service-ca-bundle\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.156062 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-client-ca\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: 
\"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.156098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8jc\" (UniqueName: \"kubernetes.io/projected/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-kube-api-access-fh8jc\") pod \"multus-admission-controller-857f4d67dd-ds5h4\" (UID: \"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.156160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-default-certificate\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.156756 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/728535b7-cedd-4391-b5e4-8cee0982380d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157270 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-trusted-ca-bundle\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157290 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157325 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b31fcf4c-ab24-4f6a-9807-04c076e2d548-proxy-tls\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157388 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khxgk\" (UniqueName: \"kubernetes.io/projected/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-kube-api-access-khxgk\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc 
kubenswrapper[4771]: I1001 14:58:26.157569 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-serving-cert\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-srv-cert\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.157657 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.657641136 +0000 UTC m=+151.276816397 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-csi-data-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.157866 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-serving-cert\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.158272 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-tmpfs\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.154086 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-dir\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.158900 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg4hf\" (UniqueName: \"kubernetes.io/projected/51a76061-b2bf-427b-985e-767ebad2a8cb-kube-api-access-jg4hf\") pod \"openshift-config-operator-7777fb866f-tk4n9\" (UID: \"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159057 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7skn\" (UniqueName: \"kubernetes.io/projected/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-kube-api-access-d7skn\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159086 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d14cf95-98ee-432d-8889-edf3508b4eb3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq8gj\" (UniqueName: \"kubernetes.io/projected/892c3644-ab53-40dc-a65e-4b14e6b537ed-kube-api-access-wq8gj\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159125 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-serving-cert\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-stats-auth\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159376 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-audit-policies\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159490 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159515 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159534 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159553 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnhjj\" (UniqueName: \"kubernetes.io/projected/d92fedc8-d031-40c1-b9fa-695496499a26-kube-api-access-wnhjj\") pod \"downloads-7954f5f757-nrlb2\" (UID: \"d92fedc8-d031-40c1-b9fa-695496499a26\") " pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-config\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159598 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bba146f-994a-4fbd-834f-861c2ffa4232-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159614 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-oauth-serving-cert\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 
14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159632 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvznm\" (UniqueName: \"kubernetes.io/projected/0d14cf95-98ee-432d-8889-edf3508b4eb3-kube-api-access-cvznm\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/728535b7-cedd-4391-b5e4-8cee0982380d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159711 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/335db6c7-efb8-4055-aacf-8262b4ec5b91-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-policies\") pod 
\"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159794 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d14cf95-98ee-432d-8889-edf3508b4eb3-metrics-tls\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159813 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvxhk\" (UniqueName: \"kubernetes.io/projected/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-kube-api-access-vvxhk\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bba146f-994a-4fbd-834f-861c2ffa4232-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159852 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-proxy-tls\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159872 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6e05857-3b53-4e95-9f02-a79ad8b509a8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pc29\" (UID: \"f6e05857-3b53-4e95-9f02-a79ad8b509a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159891 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-apiservice-cert\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159909 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdcmg\" (UniqueName: \"kubernetes.io/projected/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-kube-api-access-jdcmg\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159929 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pr972\" 
(UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159968 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-config\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.159987 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160006 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160005 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-etcd-client\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160023 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-config\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160045 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2238a30-4ae1-4bd8-acfa-1e357552252c-service-ca-bundle\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160067 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z8c2\" (UniqueName: \"kubernetes.io/projected/3f9e5d02-c03d-42b0-a837-bfa317d1cbd8-kube-api-access-2z8c2\") pod \"cluster-samples-operator-665b6dd947-ld5bx\" (UID: \"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/558574b3-154b-48bc-9436-3013a9b62f28-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160124 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxzp5\" (UniqueName: \"kubernetes.io/projected/40496e6d-3f79-4478-804b-dc9904473801-kube-api-access-pxzp5\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:26 crc 
kubenswrapper[4771]: I1001 14:58:26.160151 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cd5336-780a-49f1-9360-e7b8c97779d2-config\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160195 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwcdw\" (UniqueName: \"kubernetes.io/projected/cd8df04a-9a5e-4784-ac97-9782d936fa5e-kube-api-access-dwcdw\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160218 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhpgp\" (UniqueName: \"kubernetes.io/projected/6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4-kube-api-access-nhpgp\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gml8\" (UID: \"6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160243 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-key\") 
pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-config\") pod \"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160277 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160296 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/728535b7-cedd-4391-b5e4-8cee0982380d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160314 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f9e5d02-c03d-42b0-a837-bfa317d1cbd8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ld5bx\" (UID: \"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 
14:58:26.160342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1cd5336-780a-49f1-9360-e7b8c97779d2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1cd5336-780a-49f1-9360-e7b8c97779d2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/541a902f-ea82-44e7-9c01-e93c9e01a2b6-audit-dir\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160627 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160886 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/541a902f-ea82-44e7-9c01-e93c9e01a2b6-audit-dir\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/541a902f-ea82-44e7-9c01-e93c9e01a2b6-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.160087 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-config\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.161420 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bba146f-994a-4fbd-834f-861c2ffa4232-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.161557 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-config\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.161622 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-config\") pod 
\"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.162740 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1cd5336-780a-49f1-9360-e7b8c97779d2-config\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.163041 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.163181 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-serving-cert\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.163554 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/558574b3-154b-48bc-9436-3013a9b62f28-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.163600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-oauth-config\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.163556 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-config\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.163746 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/728535b7-cedd-4391-b5e4-8cee0982380d-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.163978 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/541a902f-ea82-44e7-9c01-e93c9e01a2b6-encryption-config\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.164815 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-config\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.165402 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"authentication-operator-config" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.165539 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/51a76061-b2bf-427b-985e-767ebad2a8cb-serving-cert\") pod \"openshift-config-operator-7777fb866f-tk4n9\" (UID: \"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.165786 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/335db6c7-efb8-4055-aacf-8262b4ec5b91-serving-cert\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.166075 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd8df04a-9a5e-4784-ac97-9782d936fa5e-serving-cert\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.167676 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.169124 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.169220 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1cd5336-780a-49f1-9360-e7b8c97779d2-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.169441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-config\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.169818 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-serving-cert\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.169912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-oauth-serving-cert\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.171647 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bba146f-994a-4fbd-834f-861c2ffa4232-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.174205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/728535b7-cedd-4391-b5e4-8cee0982380d-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.214180 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.219088 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-service-ca-bundle\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.227129 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.231792 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.232866 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.261705 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.261905 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.761872225 +0000 UTC m=+151.381047396 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.262778 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-plugins-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.263105 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-plugins-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.263260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-cert\") pod \"ingress-canary-c64p2\" (UID: \"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5\") " pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.263647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-node-bootstrap-token\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " 
pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.263924 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvw4z\" (UniqueName: \"kubernetes.io/projected/828f2820-7234-4e86-81fd-ca8666c1e640-kube-api-access-kvw4z\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.264025 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.264113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-mountpoint-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.264466 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-socket-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.264595 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-mountpoint-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: 
\"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.264682 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.764643626 +0000 UTC m=+151.383818867 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.264720 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-socket-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.264948 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.265094 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jgdw8\" (UniqueName: \"kubernetes.io/projected/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-kube-api-access-jgdw8\") pod \"ingress-canary-c64p2\" (UID: \"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5\") " pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.265419 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-csi-data-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.265767 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-csi-data-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.265915 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-certs\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.265961 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9n8l\" (UniqueName: \"kubernetes.io/projected/41c580c9-c09e-436e-9fff-05612f7b47f5-kube-api-access-m9n8l\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.266329 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-registration-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.266907 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/828f2820-7234-4e86-81fd-ca8666c1e640-registration-dir\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.285592 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.290988 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.304940 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.311948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.325760 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.335926 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.346200 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.356479 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.366526 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.367133 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.367235 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.867213814 +0000 UTC m=+151.486388995 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.367503 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-certs\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.367534 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9n8l\" (UniqueName: \"kubernetes.io/projected/41c580c9-c09e-436e-9fff-05612f7b47f5-kube-api-access-m9n8l\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.367819 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-node-bootstrap-token\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.367885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.368174 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.868163058 +0000 UTC m=+151.487338229 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.385719 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.386548 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b31fcf4c-ab24-4f6a-9807-04c076e2d548-images\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.405353 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.424962 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 01 14:58:26 crc 
kubenswrapper[4771]: I1001 14:58:26.432328 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b31fcf4c-ab24-4f6a-9807-04c076e2d548-proxy-tls\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.445342 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.465891 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.468763 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.468960 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.9689345 +0000 UTC m=+151.588109671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.469029 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.470013 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:26.970000977 +0000 UTC m=+151.589176148 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.475523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.486151 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.492957 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-policies\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.505558 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.535980 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.542690 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.545826 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.565685 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.570644 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.570797 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.070778489 +0000 UTC m=+151.689953660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.571106 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.571577 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.07156422 +0000 UTC m=+151.690739411 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.574883 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d14cf95-98ee-432d-8889-edf3508b4eb3-metrics-tls\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.594367 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.599713 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0d14cf95-98ee-432d-8889-edf3508b4eb3-trusted-ca\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.606772 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.626052 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.634723 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.653282 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.663255 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.665313 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.668108 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.672828 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.673165 4771 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.173134331 +0000 UTC m=+151.792309522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.673509 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.673896 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.173881901 +0000 UTC m=+151.793057072 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.685863 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.694237 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.705991 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.716405 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-serving-cert\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.726905 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.746650 4771 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.765989 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.775714 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.775883 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.275863043 +0000 UTC m=+151.895038214 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.776464 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.776900 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.276879869 +0000 UTC m=+151.896055050 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.786513 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.806084 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.812685 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.827897 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.855799 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.857274 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.867478 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.871768 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-config\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.877177 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.877346 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.377319332 +0000 UTC m=+151.996494513 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.878014 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.878358 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.378346239 +0000 UTC m=+151.997521410 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.891152 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.900373 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.905903 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.927354 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.945704 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.965527 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.974869 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3f9e5d02-c03d-42b0-a837-bfa317d1cbd8-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ld5bx\" (UID: \"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.979135 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.979345 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.479309075 +0000 UTC m=+152.098484286 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.979686 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:26 crc kubenswrapper[4771]: E1001 14:58:26.980191 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.480169497 +0000 UTC m=+152.099344708 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:26 crc kubenswrapper[4771]: I1001 14:58:26.986570 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.005330 4771 request.go:700] Waited for 1.005407089s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmcc-proxy-tls&limit=500&resourceVersion=0 Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.007885 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.016425 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-proxy-tls\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.024965 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.045682 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 01 
14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.055982 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gml8\" (UID: \"6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.066447 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.081756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.081983 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.581954495 +0000 UTC m=+152.201129676 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.082306 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.082652 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.582642183 +0000 UTC m=+152.201817424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.085669 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.105721 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.125674 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.146216 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.155035 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.155088 4771 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.155115 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-srv-cert podName:c107c949-1ce5-41cb-a5a6-49bf5c599fc2 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.655096661 +0000 UTC m=+152.274271832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-srv-cert") pod "olm-operator-6b444d44fb-pvm7n" (UID: "c107c949-1ce5-41cb-a5a6-49bf5c599fc2") : failed to sync secret cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.155136 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume podName:40496e6d-3f79-4478-804b-dc9904473801 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.655122902 +0000 UTC m=+152.274298073 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume") pod "collect-profiles-29322165-xdwrz" (UID: "40496e6d-3f79-4478-804b-dc9904473801") : failed to sync configmap cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.155040 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.155173 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-profile-collector-cert podName:c107c949-1ce5-41cb-a5a6-49bf5c599fc2 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.655165113 +0000 UTC m=+152.274340284 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-profile-collector-cert") pod "olm-operator-6b444d44fb-pvm7n" (UID: "c107c949-1ce5-41cb-a5a6-49bf5c599fc2") : failed to sync secret cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.157425 4771 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.157706 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-cabundle podName:fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.657682037 +0000 UTC m=+152.276857258 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-cabundle") pod "service-ca-9c57cc56f-5ddff" (UID: "fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb") : failed to sync configmap cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.157851 4771 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.157904 4771 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.157920 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-metrics-tls podName:6af25d90-8e1b-45ec-ac53-1fdd01387b9f nodeName:}" failed.
No retries permitted until 2025-10-01 14:58:27.657906083 +0000 UTC m=+152.277081254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-metrics-tls") pod "dns-default-mhd4x" (UID: "6af25d90-8e1b-45ec-ac53-1fdd01387b9f") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.157818 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.157947 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-config-volume podName:6af25d90-8e1b-45ec-ac53-1fdd01387b9f nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.657935894 +0000 UTC m=+152.277111075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-config-volume") pod "dns-default-mhd4x" (UID: "6af25d90-8e1b-45ec-ac53-1fdd01387b9f") : failed to sync configmap cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.157968 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-profile-collector-cert podName:892c3644-ab53-40dc-a65e-4b14e6b537ed nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.657958754 +0000 UTC m=+152.277133925 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-profile-collector-cert") pod "catalog-operator-68c6474976-4fgp8" (UID: "892c3644-ab53-40dc-a65e-4b14e6b537ed") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158067 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158100 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume podName:40496e6d-3f79-4478-804b-dc9904473801 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.658091608 +0000 UTC m=+152.277266779 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume") pod "collect-profiles-29322165-xdwrz" (UID: "40496e6d-3f79-4478-804b-dc9904473801") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158466 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158521 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-webhook-cert podName:abdbc7a1-d300-43d1-a23d-416a9cbc5a98 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.658507099 +0000 UTC m=+152.277682300 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-webhook-cert") pod "packageserver-d55dfcdfc-l657x" (UID: "abdbc7a1-d300-43d1-a23d-416a9cbc5a98") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158554 4771 secret.go:188] Couldn't get secret openshift-ingress/router-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158602 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-default-certificate podName:a2238a30-4ae1-4bd8-acfa-1e357552252c nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.658588561 +0000 UTC m=+152.277763812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-certificate" (UniqueName: "kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-default-certificate") pod "router-default-5444994796-wpxh2" (UID: "a2238a30-4ae1-4bd8-acfa-1e357552252c") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158554 4771 secret.go:188] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158646 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-webhook-certs podName:aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.658634182 +0000 UTC m=+152.277809443 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-webhook-certs") pod "multus-admission-controller-857f4d67dd-ds5h4" (UID: "aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158675 4771 secret.go:188] Couldn't get secret openshift-ingress/router-metrics-certs-default: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.158704 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-metrics-certs podName:a2238a30-4ae1-4bd8-acfa-1e357552252c nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.658696844 +0000 UTC m=+152.277872095 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-metrics-certs") pod "router-default-5444994796-wpxh2" (UID: "a2238a30-4ae1-4bd8-acfa-1e357552252c") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.159107 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.159335 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-srv-cert podName:892c3644-ab53-40dc-a65e-4b14e6b537ed nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.659277088 +0000 UTC m=+152.278452339 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-srv-cert") pod "catalog-operator-68c6474976-4fgp8" (UID: "892c3644-ab53-40dc-a65e-4b14e6b537ed") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.159444 4771 secret.go:188] Couldn't get secret openshift-ingress/router-stats-default: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.159495 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-stats-auth podName:a2238a30-4ae1-4bd8-acfa-1e357552252c nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.659484903 +0000 UTC m=+152.278660154 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "stats-auth" (UniqueName: "kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-stats-auth") pod "router-default-5444994796-wpxh2" (UID: "a2238a30-4ae1-4bd8-acfa-1e357552252c") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.161135 4771 configmap.go:193] Couldn't get configMap openshift-ingress/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.161198 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a2238a30-4ae1-4bd8-acfa-1e357552252c-service-ca-bundle podName:a2238a30-4ae1-4bd8-acfa-1e357552252c nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.661174097 +0000 UTC m=+152.280349378 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a2238a30-4ae1-4bd8-acfa-1e357552252c-service-ca-bundle") pod "router-default-5444994796-wpxh2" (UID: "a2238a30-4ae1-4bd8-acfa-1e357552252c") : failed to sync configmap cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.161144 4771 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.161242 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-key podName:fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.661229088 +0000 UTC m=+152.280404349 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-key") pod "service-ca-9c57cc56f-5ddff" (UID: "fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.162305 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.162325 4771 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.162356 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-apiservice-cert podName:abdbc7a1-d300-43d1-a23d-416a9cbc5a98 nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:27.662345186 +0000 UTC m=+152.281520357 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-apiservice-cert") pod "packageserver-d55dfcdfc-l657x" (UID: "abdbc7a1-d300-43d1-a23d-416a9cbc5a98") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.162381 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6e05857-3b53-4e95-9f02-a79ad8b509a8-package-server-manager-serving-cert podName:f6e05857-3b53-4e95-9f02-a79ad8b509a8 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.662372008 +0000 UTC m=+152.281547279 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/f6e05857-3b53-4e95-9f02-a79ad8b509a8-package-server-manager-serving-cert") pod "package-server-manager-789f6589d5-8pc29" (UID: "f6e05857-3b53-4e95-9f02-a79ad8b509a8") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.165796 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.184000 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.184512 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.684492881 +0000 UTC m=+152.303668052 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.185360 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.185782 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.186057 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.686048681 +0000 UTC m=+152.305223852 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.205349 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.225921 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.245347 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.264041 4771 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.264112 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-cert podName:a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.764088733 +0000 UTC m=+152.383263904 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-cert") pod "ingress-canary-c64p2" (UID: "a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.265030 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.285598 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.286656 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.286820 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.786795862 +0000 UTC m=+152.405971033 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.287329 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.287676 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.787663005 +0000 UTC m=+152.406838176 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.305177 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.325467 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.346666 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.366482 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.368477 4771 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.368708 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-node-bootstrap-token podName:41c580c9-c09e-436e-9fff-05612f7b47f5 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.868668242 +0000 UTC m=+152.487843413 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-node-bootstrap-token") pod "machine-config-server-dp6jb" (UID: "41c580c9-c09e-436e-9fff-05612f7b47f5") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.369197 4771 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.369356 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-certs podName:41c580c9-c09e-436e-9fff-05612f7b47f5 nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.869318518 +0000 UTC m=+152.488493689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-certs") pod "machine-config-server-dp6jb" (UID: "41c580c9-c09e-436e-9fff-05612f7b47f5") : failed to sync secret cache: timed out waiting for the condition Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.386493 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.388985 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.389201 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.889156794 +0000 UTC m=+152.508331965 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.389707 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.390276 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.890260463 +0000 UTC m=+152.509435824 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.407215 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.425703 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.446023 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.467862 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.486041 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.491795 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.492163 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.992133013 +0000 UTC m=+152.611308184 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.492583 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.492991 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:27.992981235 +0000 UTC m=+152.612156406 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.505674 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.525940 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.550812 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.565993 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.586501 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.594199 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.594406 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.094365602 +0000 UTC m=+152.713540803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.594851 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.595288 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.095271795 +0000 UTC m=+152.714446966 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.606166 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.626949 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.644785 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.667081 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.696376 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.696687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6e05857-3b53-4e95-9f02-a79ad8b509a8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pc29\" (UID: \"f6e05857-3b53-4e95-9f02-a79ad8b509a8\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.696772 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-apiservice-cert\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.696809 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2238a30-4ae1-4bd8-acfa-1e357552252c-service-ca-bundle\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.696868 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-key\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.696964 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-srv-cert\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697034 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume\") pod \"collect-profiles-29322165-xdwrz\" (UID: 
\"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697091 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697136 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-cabundle\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-metrics-certs\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697236 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-metrics-tls\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697273 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-ds5h4\" (UID: \"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697306 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697378 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-profile-collector-cert\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-webhook-cert\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-config-volume\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-default-certificate\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697552 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-srv-cert\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.697617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-stats-auth\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.698787 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.198723206 +0000 UTC m=+152.817898377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.699391 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-config-volume\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.699469 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-cabundle\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.701684 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-stats-auth\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.701985 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" 
Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.701991 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-apiservice-cert\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.702424 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-profile-collector-cert\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.702610 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a2238a30-4ae1-4bd8-acfa-1e357552252c-service-ca-bundle\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.703591 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-default-certificate\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.703922 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ds5h4\" (UID: \"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74\") " 
pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.704273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vjwt\" (UniqueName: \"kubernetes.io/projected/76fbf853-2ec4-4f50-a0d1-c633314219b3-kube-api-access-6vjwt\") pod \"openshift-controller-manager-operator-756b6f6bc6-2vqkq\" (UID: \"76fbf853-2ec4-4f50-a0d1-c633314219b3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.705074 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-srv-cert\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.705602 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-metrics-tls\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.705705 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a2238a30-4ae1-4bd8-acfa-1e357552252c-metrics-certs\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.706800 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume\") pod \"collect-profiles-29322165-xdwrz\" (UID: 
\"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.707544 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-signing-key\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.709502 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-profile-collector-cert\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.715646 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/892c3644-ab53-40dc-a65e-4b14e6b537ed-srv-cert\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.721226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6e05857-3b53-4e95-9f02-a79ad8b509a8-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-8pc29\" (UID: \"f6e05857-3b53-4e95-9f02-a79ad8b509a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.721500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-webhook-cert\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.724977 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.743499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzn8g\" (UniqueName: \"kubernetes.io/projected/cb5d2ccc-be5f-4f01-ad01-f44095342ed4-kube-api-access-tzn8g\") pod \"console-operator-58897d9998-75jhw\" (UID: \"cb5d2ccc-be5f-4f01-ad01-f44095342ed4\") " pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.747149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fcdm\" (UniqueName: \"kubernetes.io/projected/64c93e25-6cb1-443c-bbe4-8fb155713ddb-kube-api-access-7fcdm\") pod \"machine-approver-56656f9798-kd4c9\" (UID: \"64c93e25-6cb1-443c-bbe4-8fb155713ddb\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.762057 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r4rb\" (UniqueName: \"kubernetes.io/projected/f160df7c-e97b-4c5a-badf-08379f8e27bf-kube-api-access-7r4rb\") pod \"machine-api-operator-5694c8668f-qlkrl\" (UID: \"f160df7c-e97b-4c5a-badf-08379f8e27bf\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.793143 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-bound-sa-token\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.799668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-cert\") pod \"ingress-canary-c64p2\" (UID: \"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5\") " pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.799787 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.800335 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.300318528 +0000 UTC m=+152.919493709 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.805229 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kssl5\" (UniqueName: \"kubernetes.io/projected/6c932514-1a55-4ed3-a220-7ddcce5a4ca4-kube-api-access-kssl5\") pod \"etcd-operator-b45778765-t6brp\" (UID: \"6c932514-1a55-4ed3-a220-7ddcce5a4ca4\") " pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.824024 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rss87\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-kube-api-access-rss87\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.842708 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wz6\" (UniqueName: \"kubernetes.io/projected/13778ea7-497e-431d-a3a9-96979d2e4885-kube-api-access-v7wz6\") pod \"apiserver-76f77b778f-shl79\" (UID: \"13778ea7-497e-431d-a3a9-96979d2e4885\") " pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.863241 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtklh\" (UniqueName: \"kubernetes.io/projected/757b923d-5bf5-4b31-af60-a617c6b13559-kube-api-access-gtklh\") pod \"dns-operator-744455d44c-kwgcl\" (UID: 
\"757b923d-5bf5-4b31-af60-a617c6b13559\") " pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.866169 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.873011 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-cert\") pod \"ingress-canary-c64p2\" (UID: \"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5\") " pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.886529 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.887827 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq"] Oct 01 14:58:27 crc kubenswrapper[4771]: W1001 14:58:27.899715 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76fbf853_2ec4_4f50_a0d1_c633314219b3.slice/crio-e82eb2ee6c5b86313e48355ee9ca925644aa8fa8628d89de44029923e3848452 WatchSource:0}: Error finding container e82eb2ee6c5b86313e48355ee9ca925644aa8fa8628d89de44029923e3848452: Status 404 returned error can't find the container with id e82eb2ee6c5b86313e48355ee9ca925644aa8fa8628d89de44029923e3848452 Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.900886 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:27 crc 
kubenswrapper[4771]: E1001 14:58:27.901011 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.400983907 +0000 UTC m=+153.020159108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.901208 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.901433 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-certs\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:27 crc kubenswrapper[4771]: E1001 14:58:27.901649 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:28.401639284 +0000 UTC m=+153.020814455 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.901685 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-node-bootstrap-token\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.905794 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.926034 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.926266 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.946947 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.947137 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.966718 4771 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.977534 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.985914 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.988377 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" Oct 01 14:58:27 crc kubenswrapper[4771]: I1001 14:58:27.997063 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.002832 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.003039 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.503008391 +0000 UTC m=+153.122183572 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.003238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.003601 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.503585535 +0000 UTC m=+153.122760706 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.005493 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.024611 4771 request.go:700] Waited for 1.882639356s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.026388 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.035875 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.041531 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-node-bootstrap-token\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.046045 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.060104 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/41c580c9-c09e-436e-9fff-05612f7b47f5-certs\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.090753 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xthf\" (UniqueName: \"kubernetes.io/projected/6af25d90-8e1b-45ec-ac53-1fdd01387b9f-kube-api-access-7xthf\") pod \"dns-default-mhd4x\" (UID: \"6af25d90-8e1b-45ec-ac53-1fdd01387b9f\") " pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.110595 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.111162 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.611147261 +0000 UTC m=+153.230322432 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.123222 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n5gh\" (UniqueName: \"kubernetes.io/projected/335db6c7-efb8-4055-aacf-8262b4ec5b91-kube-api-access-2n5gh\") pod \"route-controller-manager-6576b87f9c-dsjtg\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.125482 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp8lb\" (UniqueName: \"kubernetes.io/projected/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-kube-api-access-hp8lb\") pod \"oauth-openshift-558db77b4-pr972\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.139687 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q9q6\" (UniqueName: \"kubernetes.io/projected/b248e756-e3b8-4fd9-b6fb-99ee87df696d-kube-api-access-9q9q6\") pod \"marketplace-operator-79b997595-rsd2x\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.140858 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.159414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmwhv\" (UniqueName: \"kubernetes.io/projected/2bba146f-994a-4fbd-834f-861c2ffa4232-kube-api-access-wmwhv\") pod \"kube-storage-version-migrator-operator-b67b599dd-9q7z8\" (UID: \"2bba146f-994a-4fbd-834f-861c2ffa4232\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.180716 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75svc\" (UniqueName: \"kubernetes.io/projected/a2238a30-4ae1-4bd8-acfa-1e357552252c-kube-api-access-75svc\") pod \"router-default-5444994796-wpxh2\" (UID: \"a2238a30-4ae1-4bd8-acfa-1e357552252c\") " pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.199928 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/558574b3-154b-48bc-9436-3013a9b62f28-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-7cpjp\" (UID: \"558574b3-154b-48bc-9436-3013a9b62f28\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.212495 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.212921 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.712908738 +0000 UTC m=+153.332083909 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.218825 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.226146 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.226411 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjt8\" (UniqueName: \"kubernetes.io/projected/541a902f-ea82-44e7-9c01-e93c9e01a2b6-kube-api-access-rqjt8\") pod \"apiserver-7bbb656c7d-ldrz6\" (UID: \"541a902f-ea82-44e7-9c01-e93c9e01a2b6\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.242298 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.243981 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk4cv\" (UniqueName: \"kubernetes.io/projected/fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb-kube-api-access-zk4cv\") pod \"service-ca-9c57cc56f-5ddff\" (UID: \"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb\") " pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.269827 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh5xh\" (UniqueName: \"kubernetes.io/projected/ea64286d-7a71-4d7e-b54b-5796a3a9f7df-kube-api-access-bh5xh\") pod \"machine-config-controller-84d6567774-l6gqh\" (UID: \"ea64286d-7a71-4d7e-b54b-5796a3a9f7df\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.272684 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-75jhw"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.281384 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.284309 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dcjp\" (UniqueName: \"kubernetes.io/projected/1d5cfd50-877b-49ab-82f0-7753f0fabba4-kube-api-access-5dcjp\") pod \"migrator-59844c95c7-n5mnh\" (UID: \"1d5cfd50-877b-49ab-82f0-7753f0fabba4\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.309430 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5dkt\" (UniqueName: \"kubernetes.io/projected/b31fcf4c-ab24-4f6a-9807-04c076e2d548-kube-api-access-n5dkt\") pod \"machine-config-operator-74547568cd-2kmtl\" (UID: \"b31fcf4c-ab24-4f6a-9807-04c076e2d548\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.317596 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.322100 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.822066054 +0000 UTC m=+153.441241235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.325053 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-t6brp"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.330451 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh8jc\" (UniqueName: \"kubernetes.io/projected/aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74-kube-api-access-fh8jc\") pod \"multus-admission-controller-857f4d67dd-ds5h4\" (UID: \"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.339391 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.352532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h95c4\" (UniqueName: \"kubernetes.io/projected/9db1abd4-f11c-45e1-9341-6c818c3e3579-kube-api-access-h95c4\") pod \"console-f9d7485db-szwtc\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.356369 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.361976 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.376354 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-shl79"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.379404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h76zj\" (UniqueName: \"kubernetes.io/projected/abdbc7a1-d300-43d1-a23d-416a9cbc5a98-kube-api-access-h76zj\") pod \"packageserver-d55dfcdfc-l657x\" (UID: \"abdbc7a1-d300-43d1-a23d-416a9cbc5a98\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.387170 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlcnm\" (UniqueName: \"kubernetes.io/projected/728535b7-cedd-4391-b5e4-8cee0982380d-kube-api-access-nlcnm\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.391515 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.399032 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.402788 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd5p6\" (UniqueName: \"kubernetes.io/projected/f6e05857-3b53-4e95-9f02-a79ad8b509a8-kube-api-access-nd5p6\") pod \"package-server-manager-789f6589d5-8pc29\" (UID: \"f6e05857-3b53-4e95-9f02-a79ad8b509a8\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.413386 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.415993 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-mhd4x"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.421050 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.422472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.422950 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:28.922936428 +0000 UTC m=+153.542111599 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.423460 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khxgk\" (UniqueName: \"kubernetes.io/projected/c107c949-1ce5-41cb-a5a6-49bf5c599fc2-kube-api-access-khxgk\") pod \"olm-operator-6b444d44fb-pvm7n\" (UID: \"c107c949-1ce5-41cb-a5a6-49bf5c599fc2\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.424431 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.424825 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-qlkrl"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.432591 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.455565 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg4hf\" (UniqueName: \"kubernetes.io/projected/51a76061-b2bf-427b-985e-767ebad2a8cb-kube-api-access-jg4hf\") pod \"openshift-config-operator-7777fb866f-tk4n9\" (UID: \"51a76061-b2bf-427b-985e-767ebad2a8cb\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.463480 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-kwgcl"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.467978 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7skn\" (UniqueName: \"kubernetes.io/projected/83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1-kube-api-access-d7skn\") pod \"authentication-operator-69f744f599-z6x72\" (UID: \"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.481863 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq8gj\" (UniqueName: \"kubernetes.io/projected/892c3644-ab53-40dc-a65e-4b14e6b537ed-kube-api-access-wq8gj\") pod \"catalog-operator-68c6474976-4fgp8\" (UID: \"892c3644-ab53-40dc-a65e-4b14e6b537ed\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.514699 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0d14cf95-98ee-432d-8889-edf3508b4eb3-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:28 crc 
kubenswrapper[4771]: I1001 14:58:28.522480 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvznm\" (UniqueName: \"kubernetes.io/projected/0d14cf95-98ee-432d-8889-edf3508b4eb3-kube-api-access-cvznm\") pod \"ingress-operator-5b745b69d9-qgbcc\" (UID: \"0d14cf95-98ee-432d-8889-edf3508b4eb3\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.523597 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.524095 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.024075739 +0000 UTC m=+153.643250910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.546575 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnhjj\" (UniqueName: \"kubernetes.io/projected/d92fedc8-d031-40c1-b9fa-695496499a26-kube-api-access-wnhjj\") pod \"downloads-7954f5f757-nrlb2\" (UID: \"d92fedc8-d031-40c1-b9fa-695496499a26\") " pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.551500 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.563642 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhpgp\" (UniqueName: \"kubernetes.io/projected/6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4-kube-api-access-nhpgp\") pod \"control-plane-machine-set-operator-78cbb6b69f-8gml8\" (UID: \"6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.570475 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" Oct 01 14:58:28 crc kubenswrapper[4771]: W1001 14:58:28.576774 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2238a30_4ae1_4bd8_acfa_1e357552252c.slice/crio-d3339e9198db8e94f039b6bce2862d3c70120028e137d39976191224c24a3681 WatchSource:0}: Error finding container d3339e9198db8e94f039b6bce2862d3c70120028e137d39976191224c24a3681: Status 404 returned error can't find the container with id d3339e9198db8e94f039b6bce2862d3c70120028e137d39976191224c24a3681 Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.583148 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z8c2\" (UniqueName: \"kubernetes.io/projected/3f9e5d02-c03d-42b0-a837-bfa317d1cbd8-kube-api-access-2z8c2\") pod \"cluster-samples-operator-665b6dd947-ld5bx\" (UID: \"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.590344 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.602410 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvxhk\" (UniqueName: \"kubernetes.io/projected/2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b-kube-api-access-vvxhk\") pod \"openshift-apiserver-operator-796bbdcf4f-wcgzq\" (UID: \"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.615485 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.625227 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.625835 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.125817706 +0000 UTC m=+153.744992877 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.628225 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxzp5\" (UniqueName: \"kubernetes.io/projected/40496e6d-3f79-4478-804b-dc9904473801-kube-api-access-pxzp5\") pod \"collect-profiles-29322165-xdwrz\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.645082 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdcmg\" (UniqueName: 
\"kubernetes.io/projected/1fa98d96-d0f8-4b7f-9421-50e6eceaca84-kube-api-access-jdcmg\") pod \"service-ca-operator-777779d784-hlqj5\" (UID: \"1fa98d96-d0f8-4b7f-9421-50e6eceaca84\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.649084 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.664665 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bb1bead7-b2e8-47b0-9e78-d5b6970d0121-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-mn6nl\" (UID: \"bb1bead7-b2e8-47b0-9e78-d5b6970d0121\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.665084 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.670129 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.677501 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.681154 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/728535b7-cedd-4391-b5e4-8cee0982380d-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-g7zvd\" (UID: \"728535b7-cedd-4391-b5e4-8cee0982380d\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.684015 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.706242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d1cd5336-780a-49f1-9360-e7b8c97779d2-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-nhxdc\" (UID: \"d1cd5336-780a-49f1-9360-e7b8c97779d2\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.706586 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.710307 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.710355 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.710367 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.726485 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwcdw\" (UniqueName: \"kubernetes.io/projected/cd8df04a-9a5e-4784-ac97-9782d936fa5e-kube-api-access-dwcdw\") pod \"controller-manager-879f6c89f-mjvfw\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.727137 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.727441 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.227401528 +0000 UTC m=+153.846576699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.732292 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.739987 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.763607 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvw4z\" (UniqueName: \"kubernetes.io/projected/828f2820-7234-4e86-81fd-ca8666c1e640-kube-api-access-kvw4z\") pod \"csi-hostpathplugin-5fnbm\" (UID: \"828f2820-7234-4e86-81fd-ca8666c1e640\") " pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.769161 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" Oct 01 14:58:28 crc kubenswrapper[4771]: W1001 14:58:28.781238 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod335db6c7_efb8_4055_aacf_8262b4ec5b91.slice/crio-722b0373c3f9bac7910d2d75be73c212cbb50e883aa52ddf6a0ae21ae5721f77 WatchSource:0}: Error finding container 722b0373c3f9bac7910d2d75be73c212cbb50e883aa52ddf6a0ae21ae5721f77: Status 404 returned error can't find the container with id 722b0373c3f9bac7910d2d75be73c212cbb50e883aa52ddf6a0ae21ae5721f77 Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.781556 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jgdw8\" (UniqueName: \"kubernetes.io/projected/a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5-kube-api-access-jgdw8\") pod \"ingress-canary-c64p2\" (UID: \"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5\") " pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.803838 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.810361 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.811193 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9n8l\" (UniqueName: \"kubernetes.io/projected/41c580c9-c09e-436e-9fff-05612f7b47f5-kube-api-access-m9n8l\") pod \"machine-config-server-dp6jb\" (UID: \"41c580c9-c09e-436e-9fff-05612f7b47f5\") " pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.828339 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.828999 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.328982531 +0000 UTC m=+153.948157702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.837512 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.848503 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ds5h4"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.850170 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rsd2x"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.854933 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pr972"] Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.859285 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.895701 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" event={"ID":"76fbf853-2ec4-4f50-a0d1-c633314219b3","Type":"ContainerStarted","Data":"d3c8bd45a6f2a7aae5c8bf191e42db95ae1bee0406d1634851e7319f6aa869d1"} Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.895774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" event={"ID":"76fbf853-2ec4-4f50-a0d1-c633314219b3","Type":"ContainerStarted","Data":"e82eb2ee6c5b86313e48355ee9ca925644aa8fa8628d89de44029923e3848452"} Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.897516 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-75jhw" event={"ID":"cb5d2ccc-be5f-4f01-ad01-f44095342ed4","Type":"ContainerStarted","Data":"081edc11b355fbd752a26bfe81cd65953d8a8be8e8763a9984c70e3b4ca95238"} Oct 01 14:58:28 crc 
kubenswrapper[4771]: I1001 14:58:28.897556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-75jhw" event={"ID":"cb5d2ccc-be5f-4f01-ad01-f44095342ed4","Type":"ContainerStarted","Data":"b94d4ab815f81ebffb339d30e180510efa7e9c38f0da0d723cf6369b10b62258"} Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.897723 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.899006 4771 patch_prober.go:28] interesting pod/console-operator-58897d9998-75jhw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.899051 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-75jhw" podUID="cb5d2ccc-be5f-4f01-ad01-f44095342ed4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.928935 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.929379 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" Oct 01 14:58:28 crc kubenswrapper[4771]: E1001 14:58:28.929651 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.429624199 +0000 UTC m=+154.048799400 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.934969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" event={"ID":"f160df7c-e97b-4c5a-badf-08379f8e27bf","Type":"ContainerStarted","Data":"8417827b6491d266fbda7f48ee0a56837baad01e8c7ca91bd611538427c5918c"} Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.937513 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhd4x" event={"ID":"6af25d90-8e1b-45ec-ac53-1fdd01387b9f","Type":"ContainerStarted","Data":"3130bb345282e240924d212c77f2ece41c98cde51e1431aa2d1bcd8bec5dc097"} Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.961831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" event={"ID":"335db6c7-efb8-4055-aacf-8262b4ec5b91","Type":"ContainerStarted","Data":"722b0373c3f9bac7910d2d75be73c212cbb50e883aa52ddf6a0ae21ae5721f77"} Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 
14:58:28.968586 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" event={"ID":"64c93e25-6cb1-443c-bbe4-8fb155713ddb","Type":"ContainerStarted","Data":"c242a91b986f56344800181ced1d7507f19cb218e85876fb376a877aa6737dce"} Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.968625 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" event={"ID":"64c93e25-6cb1-443c-bbe4-8fb155713ddb","Type":"ContainerStarted","Data":"d22a37858bc1f4204e0f0691a718a68c5017f481c4432d323eef6637d2630654"} Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.986892 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:28 crc kubenswrapper[4771]: I1001 14:58:28.988050 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.021230 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.033672 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.038917 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:29.538902208 +0000 UTC m=+154.158077379 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.045077 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" event={"ID":"6c932514-1a55-4ed3-a220-7ddcce5a4ca4","Type":"ContainerStarted","Data":"aed8e5870cabf4734921b1a54e314a83f315de781c222562c53e8c2f436c663f"} Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.047307 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-c64p2" Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.060450 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-shl79" event={"ID":"13778ea7-497e-431d-a3a9-96979d2e4885","Type":"ContainerStarted","Data":"aacc058b34ab3239c8ea0a1a531c7c88af3c8fdf98ff3a911b1c608769de8ccb"} Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.070500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wpxh2" event={"ID":"a2238a30-4ae1-4bd8-acfa-1e357552252c","Type":"ContainerStarted","Data":"d3339e9198db8e94f039b6bce2862d3c70120028e137d39976191224c24a3681"} Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.078485 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-dp6jb" Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.092470 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" event={"ID":"2bba146f-994a-4fbd-834f-861c2ffa4232","Type":"ContainerStarted","Data":"74c9ca32adf2e0b4233dec602f92f426f4e27bf8d534cd234dd7b1292a9414ef"} Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.095688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" event={"ID":"757b923d-5bf5-4b31-af60-a617c6b13559","Type":"ContainerStarted","Data":"273ac34fd426c863d279310c35c4093f800812d54bc7d64a3d0d16521f20735b"} Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.096855 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" event={"ID":"ea64286d-7a71-4d7e-b54b-5796a3a9f7df","Type":"ContainerStarted","Data":"cf7dcbed7dec12104e415add39776d2a6993280045ef4f16bd47525b87e578ac"} Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.117526 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z6x72"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.157957 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.173880 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.174252 4771 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.67418369 +0000 UTC m=+154.293358861 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.174342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.175215 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.675186636 +0000 UTC m=+154.294361807 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.250847 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.250904 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-5ddff"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.263839 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.278553 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.278882 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.778867242 +0000 UTC m=+154.398042413 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.303935 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.304950 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-szwtc"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.320287 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.344808 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.379924 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.380233 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:29.880221749 +0000 UTC m=+154.499396920 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: W1001 14:58:29.454151 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41c580c9_c09e_436e_9fff_05612f7b47f5.slice/crio-ceb1ab257323be4f904668f51b03c04d9d3c601ca873fe7758f22b4eb291af85 WatchSource:0}: Error finding container ceb1ab257323be4f904668f51b03c04d9d3c601ca873fe7758f22b4eb291af85: Status 404 returned error can't find the container with id ceb1ab257323be4f904668f51b03c04d9d3c601ca873fe7758f22b4eb291af85 Oct 01 14:58:29 crc kubenswrapper[4771]: W1001 14:58:29.473058 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc107c949_1ce5_41cb_a5a6_49bf5c599fc2.slice/crio-26d14f18b445274dc4df41bbdcc5029aa615b0a44da2e0dcd06ba9516bf49f96 WatchSource:0}: Error finding container 26d14f18b445274dc4df41bbdcc5029aa615b0a44da2e0dcd06ba9516bf49f96: Status 404 returned error can't find the container with id 26d14f18b445274dc4df41bbdcc5029aa615b0a44da2e0dcd06ba9516bf49f96 Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.480845 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.481208 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:29.981190516 +0000 UTC m=+154.600365687 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: W1001 14:58:29.490053 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d14cf95_98ee_432d_8889_edf3508b4eb3.slice/crio-12f95d1c02b3227a9e1fcac2c41af8c638bb02db4300240a8bfeaf58fa204f0f WatchSource:0}: Error finding container 12f95d1c02b3227a9e1fcac2c41af8c638bb02db4300240a8bfeaf58fa204f0f: Status 404 returned error can't find the container with id 12f95d1c02b3227a9e1fcac2c41af8c638bb02db4300240a8bfeaf58fa204f0f Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.591926 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.592560 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.092544857 +0000 UTC m=+154.711720028 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.692459 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.693303 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.693699 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.193685788 +0000 UTC m=+154.812860959 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.699687 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.755525 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29"] Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.794553 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.794929 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.294918342 +0000 UTC m=+154.914093513 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:29 crc kubenswrapper[4771]: I1001 14:58:29.895315 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:29 crc kubenswrapper[4771]: E1001 14:58:29.895533 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.395511559 +0000 UTC m=+155.014686730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.005655 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.005978 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.505967858 +0000 UTC m=+155.125143029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.107895 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.108911 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.60869603 +0000 UTC m=+155.227871201 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.159241 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" event={"ID":"64c93e25-6cb1-443c-bbe4-8fb155713ddb","Type":"ContainerStarted","Data":"ce20658654b61c17312f98005d703a684341f97d4123c8f5c1f4b44b83776611"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.192311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" event={"ID":"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb","Type":"ContainerStarted","Data":"bdecd5cf9334e887c953c38d27d03c92574ca6a4ad817c2d6bdab147b5a1d9d9"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.209210 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.209648 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.709633356 +0000 UTC m=+155.328808527 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.215249 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wpxh2" event={"ID":"a2238a30-4ae1-4bd8-acfa-1e357552252c","Type":"ContainerStarted","Data":"d5a89dbdd96d40e195e050495dbb00550bb746f86a7f94cbdf36fcd1ba44646b"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.216768 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" event={"ID":"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b","Type":"ContainerStarted","Data":"5756842ae4b0cc009614bf87ed1890ab8437cd290273b2e43da3d63674310671"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.241726 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.286723 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" event={"ID":"abdbc7a1-d300-43d1-a23d-416a9cbc5a98","Type":"ContainerStarted","Data":"276c2d24d65aa1ccce2161cf5d52a6c86c5515576b3fe5a4c19259ca76d51ccf"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.310235 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.311484 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.811455775 +0000 UTC m=+155.430630996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.334258 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" event={"ID":"6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4","Type":"ContainerStarted","Data":"48eddc7138f783860406170bd199deaad2620413e5d2d81a09f06aa4c464db6f"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.338856 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhd4x" event={"ID":"6af25d90-8e1b-45ec-ac53-1fdd01387b9f","Type":"ContainerStarted","Data":"51d958e9f109eac67b1c0f5f8a355ce0d02eb33186ce1f1d454603c64a7d533b"} Oct 01 14:58:30 crc kubenswrapper[4771]: W1001 14:58:30.335200 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51a76061_b2bf_427b_985e_767ebad2a8cb.slice/crio-0d23948008bda820d189e2e2f9292603365e7cfe7a3afc2f80d1270c09e05e32 WatchSource:0}: Error finding container 
0d23948008bda820d189e2e2f9292603365e7cfe7a3afc2f80d1270c09e05e32: Status 404 returned error can't find the container with id 0d23948008bda820d189e2e2f9292603365e7cfe7a3afc2f80d1270c09e05e32 Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.342600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" event={"ID":"892c3644-ab53-40dc-a65e-4b14e6b537ed","Type":"ContainerStarted","Data":"bc60c6c816c12a8607445da5d59f8f8d0278d0707c32b7c77814ed33f78b3cf5"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.354967 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" event={"ID":"6c932514-1a55-4ed3-a220-7ddcce5a4ca4","Type":"ContainerStarted","Data":"dbf1e941278fcce736941ea6dbe223d6013dd1caf57d581fb40ad5061e87c364"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.358167 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" event={"ID":"b31fcf4c-ab24-4f6a-9807-04c076e2d548","Type":"ContainerStarted","Data":"76f38a750605ddbdb6452d3a2fc6f021ef90ddddc9dea082472be49ade89acdc"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.369554 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" event={"ID":"f160df7c-e97b-4c5a-badf-08379f8e27bf","Type":"ContainerStarted","Data":"1424a610b0cc12e6757c5274f8375bdf9ce5974b5b48a06d1178c9d4b55682b3"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.372848 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" event={"ID":"541a902f-ea82-44e7-9c01-e93c9e01a2b6","Type":"ContainerStarted","Data":"1b3cc52834766eea5a78d05b2886e80f70a4eb0d04bb6f3441ed57b517cac31f"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.376876 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" event={"ID":"f6e05857-3b53-4e95-9f02-a79ad8b509a8","Type":"ContainerStarted","Data":"15281d9df79301feaffb15e1c749b310b6e6794ed4aae22fbe477b1364ae2c00"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.381859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" event={"ID":"757b923d-5bf5-4b31-af60-a617c6b13559","Type":"ContainerStarted","Data":"af3283c882e36b581fcca1cf4eb66147fe810da360ddec24636fab8c62e13802"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.388541 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" event={"ID":"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d","Type":"ContainerStarted","Data":"34b765a834fe7b4fac8af6c5f00d2a569085e864c140edf538a322ef3ed46827"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.400201 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.401873 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dp6jb" event={"ID":"41c580c9-c09e-436e-9fff-05612f7b47f5","Type":"ContainerStarted","Data":"ceb1ab257323be4f904668f51b03c04d9d3c601ca873fe7758f22b4eb291af85"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.410606 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:30 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:30 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:30 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 
14:58:30.410647 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.413372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.414299 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:30.914283428 +0000 UTC m=+155.533458609 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.424943 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" event={"ID":"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74","Type":"ContainerStarted","Data":"ce4339e8ecc7d8ea940cb2cdc1a59d185fc5b89b5e9ef8c36e6210c26dad1c32"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.453859 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-szwtc" event={"ID":"9db1abd4-f11c-45e1-9341-6c818c3e3579","Type":"ContainerStarted","Data":"6744bdf07ecad79d6adfb13deec5c69bf3eddb2b16dca4d7fcbe2ede99c007ee"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.461799 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-75jhw" podStartSLOduration=128.461779231 podStartE2EDuration="2m8.461779231s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:30.443836453 +0000 UTC m=+155.063011644" watchObservedRunningTime="2025-10-01 14:58:30.461779231 +0000 UTC m=+155.080954412" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.463939 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.481607 4771 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.494283 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.494263 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-2vqkq" podStartSLOduration=128.494242859 podStartE2EDuration="2m8.494242859s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:30.478276272 +0000 UTC m=+155.097451443" watchObservedRunningTime="2025-10-01 14:58:30.494242859 +0000 UTC m=+155.113418030" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.497837 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" event={"ID":"558574b3-154b-48bc-9436-3013a9b62f28","Type":"ContainerStarted","Data":"5d4e78db86308a71212e8c1a28593cefbc4706821f115dab6decfb8e875a8a8c"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.497875 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" event={"ID":"558574b3-154b-48bc-9436-3013a9b62f28","Type":"ContainerStarted","Data":"5ed464262815126b564be1bbcd2e3a75fc82532deade630164e94cb94d540e3a"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.505189 4771 generic.go:334] "Generic (PLEG): container finished" podID="13778ea7-497e-431d-a3a9-96979d2e4885" containerID="1de66cd9e9fa3eb3484d41987ce8eaf789a3a2038724d05280f8b0abcb8256b5" exitCode=0 Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.505255 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-shl79" event={"ID":"13778ea7-497e-431d-a3a9-96979d2e4885","Type":"ContainerDied","Data":"1de66cd9e9fa3eb3484d41987ce8eaf789a3a2038724d05280f8b0abcb8256b5"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.516585 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.517139 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.017125424 +0000 UTC m=+155.636300595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.518024 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.518449 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" event={"ID":"1d5cfd50-877b-49ab-82f0-7753f0fabba4","Type":"ContainerStarted","Data":"9e70c7d69b7ac5a58f2522310680867aafb1c33bf04d6a29d3ed16fa2a56347d"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.520602 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-c64p2"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.556231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" event={"ID":"b248e756-e3b8-4fd9-b6fb-99ee87df696d","Type":"ContainerStarted","Data":"f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.556276 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.556296 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" 
event={"ID":"b248e756-e3b8-4fd9-b6fb-99ee87df696d","Type":"ContainerStarted","Data":"7dec245134ec70155e79f5150e26a7d3b78f05c6ff1948a3ecd14210dc66e1f9"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.556694 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.569705 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nrlb2"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.584259 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" event={"ID":"0d14cf95-98ee-432d-8889-edf3508b4eb3","Type":"ContainerStarted","Data":"12f95d1c02b3227a9e1fcac2c41af8c638bb02db4300240a8bfeaf58fa204f0f"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.584485 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" event={"ID":"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1","Type":"ContainerStarted","Data":"332e9f91669514a9f2a11e1a8f31dd41527369e0074fc1a88b3c6c8fe2bfdd2f"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.582824 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rsd2x container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.584745 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" podUID="b248e756-e3b8-4fd9-b6fb-99ee87df696d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Oct 01 14:58:30 crc kubenswrapper[4771]: 
I1001 14:58:30.592902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" event={"ID":"c107c949-1ce5-41cb-a5a6-49bf5c599fc2","Type":"ContainerStarted","Data":"26d14f18b445274dc4df41bbdcc5029aa615b0a44da2e0dcd06ba9516bf49f96"} Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.614988 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-5fnbm"] Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.622231 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.622625 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mjvfw"] Oct 01 14:58:30 crc kubenswrapper[4771]: W1001 14:58:30.622946 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40496e6d_3f79_4478_804b_dc9904473801.slice/crio-8ea2b87434699e66bdcc1fa2e5c8511f110832cb1ea8f07ec0398bca9ebcc2a3 WatchSource:0}: Error finding container 8ea2b87434699e66bdcc1fa2e5c8511f110832cb1ea8f07ec0398bca9ebcc2a3: Status 404 returned error can't find the container with id 8ea2b87434699e66bdcc1fa2e5c8511f110832cb1ea8f07ec0398bca9ebcc2a3 Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.627098 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:31.127082819 +0000 UTC m=+155.746257990 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: W1001 14:58:30.630224 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1cd5336_780a_49f1_9360_e7b8c97779d2.slice/crio-efee973c5e00f3239bd378ba4e633a6d6fa1d63d6a830b7effe56af69cd97343 WatchSource:0}: Error finding container efee973c5e00f3239bd378ba4e633a6d6fa1d63d6a830b7effe56af69cd97343: Status 404 returned error can't find the container with id efee973c5e00f3239bd378ba4e633a6d6fa1d63d6a830b7effe56af69cd97343 Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.638131 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-75jhw" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.679125 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-7cpjp" podStartSLOduration=128.679107227 podStartE2EDuration="2m8.679107227s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:30.648503406 +0000 UTC m=+155.267678577" watchObservedRunningTime="2025-10-01 14:58:30.679107227 +0000 UTC m=+155.298282398" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.680210 4771 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" podStartSLOduration=128.680204075 podStartE2EDuration="2m8.680204075s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:30.677971808 +0000 UTC m=+155.297146979" watchObservedRunningTime="2025-10-01 14:58:30.680204075 +0000 UTC m=+155.299379246" Oct 01 14:58:30 crc kubenswrapper[4771]: W1001 14:58:30.685342 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd8df04a_9a5e_4784_ac97_9782d936fa5e.slice/crio-04b1598296b2c2a54e152e156a7b6a4d9a4db06b2f471b4e81adf4cd7f938b26 WatchSource:0}: Error finding container 04b1598296b2c2a54e152e156a7b6a4d9a4db06b2f471b4e81adf4cd7f938b26: Status 404 returned error can't find the container with id 04b1598296b2c2a54e152e156a7b6a4d9a4db06b2f471b4e81adf4cd7f938b26 Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.724313 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.726914 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.226888236 +0000 UTC m=+155.846063407 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.761588 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" podStartSLOduration=127.761570011 podStartE2EDuration="2m7.761570011s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:30.76110601 +0000 UTC m=+155.380281191" watchObservedRunningTime="2025-10-01 14:58:30.761570011 +0000 UTC m=+155.380745182" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.826492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.826880 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.326866348 +0000 UTC m=+155.946041519 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.845560 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wpxh2" podStartSLOduration=128.845539404 podStartE2EDuration="2m8.845539404s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:30.845396821 +0000 UTC m=+155.464571992" watchObservedRunningTime="2025-10-01 14:58:30.845539404 +0000 UTC m=+155.464714575" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.864563 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-t6brp" podStartSLOduration=128.86454003 podStartE2EDuration="2m8.86454003s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:30.811193538 +0000 UTC m=+155.430368709" watchObservedRunningTime="2025-10-01 14:58:30.86454003 +0000 UTC m=+155.483715201" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.876843 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kd4c9" podStartSLOduration=128.876823902 podStartE2EDuration="2m8.876823902s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:30.875199421 +0000 UTC m=+155.494374602" watchObservedRunningTime="2025-10-01 14:58:30.876823902 +0000 UTC m=+155.495999073" Oct 01 14:58:30 crc kubenswrapper[4771]: I1001 14:58:30.933321 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:30 crc kubenswrapper[4771]: E1001 14:58:30.933652 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.433636873 +0000 UTC m=+156.052812044 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.042788 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.043323 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.543310982 +0000 UTC m=+156.162486143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.144183 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.145175 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.645154091 +0000 UTC m=+156.264329272 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.246951 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.247336 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.747309398 +0000 UTC m=+156.366484569 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.348425 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.348854 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.848822129 +0000 UTC m=+156.467997300 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.349674 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.350058 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.85004781 +0000 UTC m=+156.469222981 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.403891 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:31 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:31 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:31 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.403943 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.451426 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.451562 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:31.95154641 +0000 UTC m=+156.570721581 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.451657 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.451958 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:31.95194684 +0000 UTC m=+156.571122011 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.552441 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.552657 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.05263095 +0000 UTC m=+156.671806121 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.552693 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.553023 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.05301558 +0000 UTC m=+156.672190751 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.600125 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" event={"ID":"bb1bead7-b2e8-47b0-9e78-d5b6970d0121","Type":"ContainerStarted","Data":"2e142dc82fa4bddb36aaeee5739cce9041e06fe5f7b344f83bbe1c1cea4439e6"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.600176 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" event={"ID":"bb1bead7-b2e8-47b0-9e78-d5b6970d0121","Type":"ContainerStarted","Data":"3006f46124d3008bfc7653d1f354c956f33bd41803bb05710870aa4ec078968d"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.602031 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" event={"ID":"892c3644-ab53-40dc-a65e-4b14e6b537ed","Type":"ContainerStarted","Data":"ba06b8377ba3485e2536939e98ba8d7a269c541529de14c2dcec71b2adad9baa"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.602296 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.603229 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-dp6jb" 
event={"ID":"41c580c9-c09e-436e-9fff-05612f7b47f5","Type":"ContainerStarted","Data":"9416847fca263f6c7a940c5e4cd768aecf8b87aaf555caa6fd93f99b30ecf078"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.603996 4771 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4fgp8 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.604038 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" podUID="892c3644-ab53-40dc-a65e-4b14e6b537ed" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.604472 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" event={"ID":"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74","Type":"ContainerStarted","Data":"e24c82b9d0c3efb7942fe83fc2ad7ff34ecc5734afb96df83dae70dc6b1531f8"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.604494 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" event={"ID":"aac54d18-3f6e-4bf8-98f6-6e24c2d8ed74","Type":"ContainerStarted","Data":"f0ac40f03706bb3651f6e0dfe25cbe4771e42d1da89211ee94be0431c3459102"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.605388 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" event={"ID":"c107c949-1ce5-41cb-a5a6-49bf5c599fc2","Type":"ContainerStarted","Data":"b54170e5f678c6ee265706bce339041ba2cb35b71c26d93491b73ec5a28b132e"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 
14:58:31.605637 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.606681 4771 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pvm7n container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.606721 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" podUID="c107c949-1ce5-41cb-a5a6-49bf5c599fc2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.607192 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" event={"ID":"1d5cfd50-877b-49ab-82f0-7753f0fabba4","Type":"ContainerStarted","Data":"5d0431bf640c7b6804dc18770b927026d8f69ff59aea818c5adeb047334c515f"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.607222 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" event={"ID":"1d5cfd50-877b-49ab-82f0-7753f0fabba4","Type":"ContainerStarted","Data":"23fe41bdfa370e0e789f9e84798cb302b897cad49deeae190ad21651d14175f0"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.609866 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" event={"ID":"40496e6d-3f79-4478-804b-dc9904473801","Type":"ContainerStarted","Data":"1262a15d0ce773b7696404ec7c2315d8babf9c50bfe5a5ed0b38caf7396c9c11"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 
14:58:31.609901 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" event={"ID":"40496e6d-3f79-4478-804b-dc9904473801","Type":"ContainerStarted","Data":"8ea2b87434699e66bdcc1fa2e5c8511f110832cb1ea8f07ec0398bca9ebcc2a3"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.611716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" event={"ID":"abdbc7a1-d300-43d1-a23d-416a9cbc5a98","Type":"ContainerStarted","Data":"1576be8ec3f5aa5c7ef71dc34dadfc7e419200015afc04053a0d08a83895fd64"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.612419 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.614014 4771 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-l657x container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" start-of-body= Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.614056 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" podUID="abdbc7a1-d300-43d1-a23d-416a9cbc5a98" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.36:5443/healthz\": dial tcp 10.217.0.36:5443: connect: connection refused" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.614562 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" event={"ID":"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8","Type":"ContainerStarted","Data":"b08f78b76ee375b5947439ec21ba2ec31874e246a2f2866024154f649649d5c9"} Oct 01 14:58:31 crc kubenswrapper[4771]: 
I1001 14:58:31.614590 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" event={"ID":"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8","Type":"ContainerStarted","Data":"d30e2952d29a404c157630c987422509c2c92777c93ba2cdd5195cfb0523de5e"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.616718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" event={"ID":"2bba146f-994a-4fbd-834f-861c2ffa4232","Type":"ContainerStarted","Data":"8df2ee84a28d15bac04381af9f6520f489b438393c4c6a54c8a2e75dd6cb2440"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.618617 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-szwtc" event={"ID":"9db1abd4-f11c-45e1-9341-6c818c3e3579","Type":"ContainerStarted","Data":"92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.628252 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" event={"ID":"757b923d-5bf5-4b31-af60-a617c6b13559","Type":"ContainerStarted","Data":"d3044ca0a87a07070418e7b621dc5546f45c105cda40b1e646c56801e09a0381"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.630339 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" podStartSLOduration=128.630320113 podStartE2EDuration="2m8.630320113s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.628857516 +0000 UTC m=+156.248032687" watchObservedRunningTime="2025-10-01 14:58:31.630320113 +0000 UTC m=+156.249495284" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 
14:58:31.630644 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" event={"ID":"cd8df04a-9a5e-4784-ac97-9782d936fa5e","Type":"ContainerStarted","Data":"4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.630679 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" event={"ID":"cd8df04a-9a5e-4784-ac97-9782d936fa5e","Type":"ContainerStarted","Data":"04b1598296b2c2a54e152e156a7b6a4d9a4db06b2f471b4e81adf4cd7f938b26"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.631345 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.632078 4771 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mjvfw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.632112 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" podUID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.632286 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z6x72" event={"ID":"83e0e747-e0e6-4aab-bc5c-27d0b41e2fb1","Type":"ContainerStarted","Data":"768425e7e5bbaa7cd608daea7d17f7d5fed571d0bc6c0b304527dacd493a2445"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 
14:58:31.633342 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nrlb2" event={"ID":"d92fedc8-d031-40c1-b9fa-695496499a26","Type":"ContainerStarted","Data":"b12739e00856863455d09ae0a77e9f900bca258a87a6bceb7f1153989faa19bd"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.634367 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" event={"ID":"728535b7-cedd-4391-b5e4-8cee0982380d","Type":"ContainerStarted","Data":"a6d5fd4695c23fe9e54b5883e70cfe133062ee47b568e8934e444dad57a1321e"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.637246 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" event={"ID":"d1cd5336-780a-49f1-9360-e7b8c97779d2","Type":"ContainerStarted","Data":"efee973c5e00f3239bd378ba4e633a6d6fa1d63d6a830b7effe56af69cd97343"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.638242 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" event={"ID":"f6e05857-3b53-4e95-9f02-a79ad8b509a8","Type":"ContainerStarted","Data":"753b7be1ed636c5bdd328135671b25d2b2c95109f1cbae9655fa82fa406bc79e"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.640518 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" event={"ID":"b31fcf4c-ab24-4f6a-9807-04c076e2d548","Type":"ContainerStarted","Data":"0808200d9a48aff27cdfc33099d59177287cf2890e0e8cf11300002d44928463"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.640560 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" 
event={"ID":"b31fcf4c-ab24-4f6a-9807-04c076e2d548","Type":"ContainerStarted","Data":"e91858300d8b57fa09513ece00b0a53fe1bd7a4a9c264a26dc0550929a9e2f66"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.643722 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-mhd4x" event={"ID":"6af25d90-8e1b-45ec-ac53-1fdd01387b9f","Type":"ContainerStarted","Data":"15830d524e88036d8571fe8322561969c0c38519f5f14677536b828df4bfb45a"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.643808 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.644091 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" podStartSLOduration=128.644077793 podStartE2EDuration="2m8.644077793s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.643361836 +0000 UTC m=+156.262537017" watchObservedRunningTime="2025-10-01 14:58:31.644077793 +0000 UTC m=+156.263252974" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.648383 4771 generic.go:334] "Generic (PLEG): container finished" podID="541a902f-ea82-44e7-9c01-e93c9e01a2b6" containerID="eee719ebeba6ce8658cf79936cef6fd99c137365f3f100421677cdb9290c8f5c" exitCode=0 Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.648447 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" event={"ID":"541a902f-ea82-44e7-9c01-e93c9e01a2b6","Type":"ContainerDied","Data":"eee719ebeba6ce8658cf79936cef6fd99c137365f3f100421677cdb9290c8f5c"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.650894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" 
event={"ID":"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d","Type":"ContainerStarted","Data":"cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.651599 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.652884 4771 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pr972 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" start-of-body= Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.652987 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" podUID="0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.653390 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.653583 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.153563046 +0000 UTC m=+156.772738217 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.653629 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.653984 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.153975436 +0000 UTC m=+156.773150607 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.665463 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" event={"ID":"fcfb70b0-1fb4-49ec-b1e2-1d7788b52aeb","Type":"ContainerStarted","Data":"5de652fbc591ee4c0f63c8ab633e9be382543db0bc4870eee7c7051bf9aaf82f"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.668438 4771 generic.go:334] "Generic (PLEG): container finished" podID="51a76061-b2bf-427b-985e-767ebad2a8cb" containerID="f6e6da344aa4523a58b8ec878d2587f41a5c27f34a75767560202c39c5d90833" exitCode=0 Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.668664 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" event={"ID":"51a76061-b2bf-427b-985e-767ebad2a8cb","Type":"ContainerDied","Data":"f6e6da344aa4523a58b8ec878d2587f41a5c27f34a75767560202c39c5d90833"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.668716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" event={"ID":"51a76061-b2bf-427b-985e-767ebad2a8cb","Type":"ContainerStarted","Data":"0d23948008bda820d189e2e2f9292603365e7cfe7a3afc2f80d1270c09e05e32"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.668892 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9q7z8" 
podStartSLOduration=129.668876027 podStartE2EDuration="2m9.668876027s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.665635994 +0000 UTC m=+156.284811165" watchObservedRunningTime="2025-10-01 14:58:31.668876027 +0000 UTC m=+156.288051198" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.686760 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ds5h4" podStartSLOduration=128.686712292 podStartE2EDuration="2m8.686712292s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.685986913 +0000 UTC m=+156.305162084" watchObservedRunningTime="2025-10-01 14:58:31.686712292 +0000 UTC m=+156.305887463" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.691716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" event={"ID":"2baafe19-ab7c-43c2-bd6b-0d6398b9fb3b","Type":"ContainerStarted","Data":"9a251b60ffe73162b617952c80d4f66ede6e4a79369829a3f51cf3e7d1748df3"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.712368 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-dp6jb" podStartSLOduration=5.712348106 podStartE2EDuration="5.712348106s" podCreationTimestamp="2025-10-01 14:58:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.706003575 +0000 UTC m=+156.325178756" watchObservedRunningTime="2025-10-01 14:58:31.712348106 +0000 UTC m=+156.331523277" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.716868 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" event={"ID":"6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4","Type":"ContainerStarted","Data":"b9fb2cffb1c15e344f44150c3396438d2b940205ca57320f7528dee3eb63af84"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.724191 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-n5mnh" podStartSLOduration=128.724169978 podStartE2EDuration="2m8.724169978s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.721298005 +0000 UTC m=+156.340473186" watchObservedRunningTime="2025-10-01 14:58:31.724169978 +0000 UTC m=+156.343345149" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.725968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" event={"ID":"828f2820-7234-4e86-81fd-ca8666c1e640","Type":"ContainerStarted","Data":"31ad94464979b9c3acdb028d1cade9ccac172efcea00af5e425727b5701a345d"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.729820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" event={"ID":"335db6c7-efb8-4055-aacf-8262b4ec5b91","Type":"ContainerStarted","Data":"f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.730692 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.733688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c64p2" 
event={"ID":"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5","Type":"ContainerStarted","Data":"41fa036ef92e3e21fcadbd20290308a20c367259a04440072edae5870d5aa742"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.733717 4771 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-dsjtg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.733788 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" podUID="335db6c7-efb8-4055-aacf-8262b4ec5b91" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.733747 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-c64p2" event={"ID":"a3fb633f-3f5a-4691-9b17-e59e9ef8b8b5","Type":"ContainerStarted","Data":"12506406dcc4f196bfbcd9767be6809568e8518068eea4dc4165634a93ea7bbc"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.741658 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" podStartSLOduration=129.741643954 podStartE2EDuration="2m9.741643954s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.738429202 +0000 UTC m=+156.357604373" watchObservedRunningTime="2025-10-01 14:58:31.741643954 +0000 UTC m=+156.360819125" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.749552 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" event={"ID":"ea64286d-7a71-4d7e-b54b-5796a3a9f7df","Type":"ContainerStarted","Data":"b3f4e71415b766cdf9187a6f2f9cf63b76bc27da7ed50f90543d69ee3141f8d1"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.749603 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" event={"ID":"ea64286d-7a71-4d7e-b54b-5796a3a9f7df","Type":"ContainerStarted","Data":"f06b079ea77186174a17d71ab03501b0dc94ce8834735f4e8d02baa912ef0686"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.754616 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.756145 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.256125893 +0000 UTC m=+156.875301064 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.754618 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" event={"ID":"f160df7c-e97b-4c5a-badf-08379f8e27bf","Type":"ContainerStarted","Data":"7cb94af4f5d9f1e7e68918de86282463d832dd8433e58efd1e03b68edb6e8273"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.768385 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" podStartSLOduration=128.768368636 podStartE2EDuration="2m8.768368636s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.762108596 +0000 UTC m=+156.381283767" watchObservedRunningTime="2025-10-01 14:58:31.768368636 +0000 UTC m=+156.387543807" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.770568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" event={"ID":"0d14cf95-98ee-432d-8889-edf3508b4eb3","Type":"ContainerStarted","Data":"7f9cc9f2df953da627639141c87f832cc8c8ff09c46e28b5ffeec687f0bb3565"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.770615 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" 
event={"ID":"0d14cf95-98ee-432d-8889-edf3508b4eb3","Type":"ContainerStarted","Data":"6f88fd4295b3087978ad8c126067e0950b813634df594ad2bd51d1455b68bdef"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.772237 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" event={"ID":"1fa98d96-d0f8-4b7f-9421-50e6eceaca84","Type":"ContainerStarted","Data":"a6d34d8e43f56e44d1b30f207e57ae31eeb6b61c7ad58d0196296a5bedfff3d3"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.772273 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" event={"ID":"1fa98d96-d0f8-4b7f-9421-50e6eceaca84","Type":"ContainerStarted","Data":"7e433aae676ef71d3fcb5dd61089d0eaee4c5b1e013ca2718be3ca222167de48"} Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.784863 4771 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-rsd2x container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" start-of-body= Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.784941 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" podUID="b248e756-e3b8-4fd9-b6fb-99ee87df696d" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.23:8080/healthz\": dial tcp 10.217.0.23:8080: connect: connection refused" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.842463 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-kwgcl" podStartSLOduration=129.842444766 podStartE2EDuration="2m9.842444766s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.796423702 +0000 UTC m=+156.415598873" watchObservedRunningTime="2025-10-01 14:58:31.842444766 +0000 UTC m=+156.461619937" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.844876 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-szwtc" podStartSLOduration=129.844863988 podStartE2EDuration="2m9.844863988s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.843309459 +0000 UTC m=+156.462484630" watchObservedRunningTime="2025-10-01 14:58:31.844863988 +0000 UTC m=+156.464039159" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.857124 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.865604 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.365585567 +0000 UTC m=+156.984760738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.876659 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-5ddff" podStartSLOduration=128.876622038 podStartE2EDuration="2m8.876622038s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.873695853 +0000 UTC m=+156.492871054" watchObservedRunningTime="2025-10-01 14:58:31.876622038 +0000 UTC m=+156.495797209" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.955086 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-mhd4x" podStartSLOduration=6.955062441 podStartE2EDuration="6.955062441s" podCreationTimestamp="2025-10-01 14:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.920713324 +0000 UTC m=+156.539888515" watchObservedRunningTime="2025-10-01 14:58:31.955062441 +0000 UTC m=+156.574237612" Oct 01 14:58:31 crc kubenswrapper[4771]: I1001 14:58:31.960280 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 
01 14:58:31 crc kubenswrapper[4771]: E1001 14:58:31.960823 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.460805877 +0000 UTC m=+157.079981048 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.005366 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-c64p2" podStartSLOduration=7.005344803 podStartE2EDuration="7.005344803s" podCreationTimestamp="2025-10-01 14:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:31.95818797 +0000 UTC m=+156.577363141" watchObservedRunningTime="2025-10-01 14:58:32.005344803 +0000 UTC m=+156.624519974" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.061597 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.062159 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.562119923 +0000 UTC m=+157.181295154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.121680 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" podStartSLOduration=130.121659522 podStartE2EDuration="2m10.121659522s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.098183983 +0000 UTC m=+156.717359154" watchObservedRunningTime="2025-10-01 14:58:32.121659522 +0000 UTC m=+156.740834693" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.137592 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" podStartSLOduration=130.137573498 podStartE2EDuration="2m10.137573498s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.122473493 +0000 UTC m=+156.741648684" watchObservedRunningTime="2025-10-01 14:58:32.137573498 +0000 UTC m=+156.756748669" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.140348 4771 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qgbcc" podStartSLOduration=130.140334459 podStartE2EDuration="2m10.140334459s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.137339903 +0000 UTC m=+156.756515094" watchObservedRunningTime="2025-10-01 14:58:32.140334459 +0000 UTC m=+156.759509630" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.158741 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-hlqj5" podStartSLOduration=129.158713558 podStartE2EDuration="2m9.158713558s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.156954233 +0000 UTC m=+156.776129434" watchObservedRunningTime="2025-10-01 14:58:32.158713558 +0000 UTC m=+156.777888729" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.162609 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.162966 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.662948906 +0000 UTC m=+157.282124077 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.236221 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8gml8" podStartSLOduration=129.236203375 podStartE2EDuration="2m9.236203375s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.233329692 +0000 UTC m=+156.852504873" watchObservedRunningTime="2025-10-01 14:58:32.236203375 +0000 UTC m=+156.855378556" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.263845 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.264212 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.76420006 +0000 UTC m=+157.383375231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.275696 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-l6gqh" podStartSLOduration=129.275669982 podStartE2EDuration="2m9.275669982s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.27557604 +0000 UTC m=+156.894751231" watchObservedRunningTime="2025-10-01 14:58:32.275669982 +0000 UTC m=+156.894845153" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.359845 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" podStartSLOduration=129.359824171 podStartE2EDuration="2m9.359824171s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.321048221 +0000 UTC m=+156.940223422" watchObservedRunningTime="2025-10-01 14:58:32.359824171 +0000 UTC m=+156.978999352" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.364580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.364880 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.864860379 +0000 UTC m=+157.484035550 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.364952 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.365339 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.865329281 +0000 UTC m=+157.484504462 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.397509 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-wcgzq" podStartSLOduration=130.397489761 podStartE2EDuration="2m10.397489761s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.395590353 +0000 UTC m=+157.014765534" watchObservedRunningTime="2025-10-01 14:58:32.397489761 +0000 UTC m=+157.016664932" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.399336 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-qlkrl" podStartSLOduration=129.399328419 podStartE2EDuration="2m9.399328419s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.361400701 +0000 UTC m=+156.980575882" watchObservedRunningTime="2025-10-01 14:58:32.399328419 +0000 UTC m=+157.018503590" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.403879 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:32 crc 
kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:32 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:32 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.403946 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.465580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.465830 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.965798925 +0000 UTC m=+157.584974106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.465917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.466220 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:32.966205246 +0000 UTC m=+157.585380417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.566580 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.566979 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.066948487 +0000 UTC m=+157.686123668 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.567071 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.567470 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.067459379 +0000 UTC m=+157.686634550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.668476 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.668918 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.168884767 +0000 UTC m=+157.788059938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.769905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.770311 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.270295356 +0000 UTC m=+157.889470517 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.788820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" event={"ID":"51a76061-b2bf-427b-985e-767ebad2a8cb","Type":"ContainerStarted","Data":"68a6c571456be3286e4f58e5746c6900baff6ad8ecc1b0f4b20842662b319268"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.788902 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.790345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nrlb2" event={"ID":"d92fedc8-d031-40c1-b9fa-695496499a26","Type":"ContainerStarted","Data":"b91f19180c14d798fb0e8eec2e69be4d8718dd33f4af111958c5050e58f276e7"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.790535 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.792024 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" event={"ID":"541a902f-ea82-44e7-9c01-e93c9e01a2b6","Type":"ContainerStarted","Data":"85206ef2265130acea9d844448819389d568c9b7289cfc8ef154fc3969a78b24"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.792357 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.792397 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.793723 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" event={"ID":"3f9e5d02-c03d-42b0-a837-bfa317d1cbd8","Type":"ContainerStarted","Data":"c90ac3467043b32d59714e40a6e63f43c7132f7e7cace3711b300c28d110b73f"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.794966 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" event={"ID":"728535b7-cedd-4391-b5e4-8cee0982380d","Type":"ContainerStarted","Data":"8cd945b1a51974c3b3969a36179fedf375e5a72edca5fba5e8f7924c257db3f6"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.796246 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" event={"ID":"d1cd5336-780a-49f1-9360-e7b8c97779d2","Type":"ContainerStarted","Data":"7d26e71863d7123624b433e56fc4781c63fa02a4b1355245e85beb6e5cdda045"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.798150 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" event={"ID":"f6e05857-3b53-4e95-9f02-a79ad8b509a8","Type":"ContainerStarted","Data":"5bdac63f48ecfd334b4c260ae4b7c6e4e975b572339ff40fa7a8139b4d739418"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 
14:58:32.798267 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.800180 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-shl79" event={"ID":"13778ea7-497e-431d-a3a9-96979d2e4885","Type":"ContainerStarted","Data":"7bba626c617eb6daba48d3d2913e61e7beac48c7b52d361edd6748458c2216da"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.800225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-shl79" event={"ID":"13778ea7-497e-431d-a3a9-96979d2e4885","Type":"ContainerStarted","Data":"c7dd0949697d24186bf55abd5eb3c3006112e7416ec4bbbd03e9f89e0dcd310c"} Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.800916 4771 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-pr972 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" start-of-body= Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.800956 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" podUID="0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.19:6443/healthz\": dial tcp 10.217.0.19:6443: connect: connection refused" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.801018 4771 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mjvfw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.801032 4771 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" podUID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.801901 4771 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-pvm7n container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.801945 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" podUID="c107c949-1ce5-41cb-a5a6-49bf5c599fc2" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.801947 4771 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4fgp8 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.802196 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" podUID="892c3644-ab53-40dc-a65e-4b14e6b537ed" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.827078 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.832925 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" podStartSLOduration=130.832899823 podStartE2EDuration="2m10.832899823s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.824880399 +0000 UTC m=+157.444055590" watchObservedRunningTime="2025-10-01 14:58:32.832899823 +0000 UTC m=+157.452074994" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.846981 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2kmtl" podStartSLOduration=129.846963013 podStartE2EDuration="2m9.846963013s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.844886699 +0000 UTC m=+157.464061870" watchObservedRunningTime="2025-10-01 14:58:32.846963013 +0000 UTC m=+157.466138184" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.871358 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.871552 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:33.371517549 +0000 UTC m=+157.990692730 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.872454 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.886386 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.386369758 +0000 UTC m=+158.005544999 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.904288 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" podStartSLOduration=129.904262004 podStartE2EDuration="2m9.904262004s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.892520075 +0000 UTC m=+157.511695246" watchObservedRunningTime="2025-10-01 14:58:32.904262004 +0000 UTC m=+157.523437175" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.926675 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.926746 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.931121 4771 patch_prober.go:28] interesting pod/apiserver-76f77b778f-shl79 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.931182 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-shl79" podUID="13778ea7-497e-431d-a3a9-96979d2e4885" 
containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.943371 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-mn6nl" podStartSLOduration=130.943355302 podStartE2EDuration="2m10.943355302s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.942541442 +0000 UTC m=+157.561716623" watchObservedRunningTime="2025-10-01 14:58:32.943355302 +0000 UTC m=+157.562530483" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.947091 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-g7zvd" podStartSLOduration=130.947065237 podStartE2EDuration="2m10.947065237s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.929555201 +0000 UTC m=+157.548730372" watchObservedRunningTime="2025-10-01 14:58:32.947065237 +0000 UTC m=+157.566240408" Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.974337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:32 crc kubenswrapper[4771]: E1001 14:58:32.974706 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.474690932 +0000 UTC m=+158.093866103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:32 crc kubenswrapper[4771]: I1001 14:58:32.978115 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" podStartSLOduration=129.978097939 podStartE2EDuration="2m9.978097939s" podCreationTimestamp="2025-10-01 14:56:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:32.977703799 +0000 UTC m=+157.596878990" watchObservedRunningTime="2025-10-01 14:58:32.978097939 +0000 UTC m=+157.597273110" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.029198 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nrlb2" podStartSLOduration=131.029180773 podStartE2EDuration="2m11.029180773s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:33.028718441 +0000 UTC m=+157.647893622" watchObservedRunningTime="2025-10-01 14:58:33.029180773 +0000 UTC m=+157.648355934" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.030156 4771 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-nhxdc" podStartSLOduration=131.030148317 podStartE2EDuration="2m11.030148317s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:33.009969553 +0000 UTC m=+157.629144734" watchObservedRunningTime="2025-10-01 14:58:33.030148317 +0000 UTC m=+157.649323488" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.056473 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ld5bx" podStartSLOduration=131.056457879 podStartE2EDuration="2m11.056457879s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:33.05297609 +0000 UTC m=+157.672151261" watchObservedRunningTime="2025-10-01 14:58:33.056457879 +0000 UTC m=+157.675633050" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.076235 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.076607 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.576587073 +0000 UTC m=+158.195762314 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.101304 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-shl79" podStartSLOduration=131.101285403 podStartE2EDuration="2m11.101285403s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:33.099769264 +0000 UTC m=+157.718944445" watchObservedRunningTime="2025-10-01 14:58:33.101285403 +0000 UTC m=+157.720460604" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.177143 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.177356 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.677324603 +0000 UTC m=+158.296499774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.177443 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.177754 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.677741374 +0000 UTC m=+158.296916545 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.222748 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-l657x" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.278792 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.278998 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.778964947 +0000 UTC m=+158.398140118 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.279250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.279548 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.779536302 +0000 UTC m=+158.398711463 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.380397 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.380573 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.88054699 +0000 UTC m=+158.499722161 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.380825 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.381153 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.881139675 +0000 UTC m=+158.500314846 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.405701 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:33 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:33 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:33 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.405793 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.414671 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.414757 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.416544 4771 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-ldrz6 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.13:8443/livez\": dial tcp 10.217.0.13:8443: connect: connection 
refused" start-of-body= Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.416621 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" podUID="541a902f-ea82-44e7-9c01-e93c9e01a2b6" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.13:8443/livez\": dial tcp 10.217.0.13:8443: connect: connection refused" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.481572 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.481782 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.981716332 +0000 UTC m=+158.600891503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.481864 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.482280 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:33.982269456 +0000 UTC m=+158.601444707 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.583421 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.583623 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.083584442 +0000 UTC m=+158.702759623 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.583696 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.584066 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.084053794 +0000 UTC m=+158.703228975 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.684888 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.685108 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.185069251 +0000 UTC m=+158.804244432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.685519 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.685930 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.185916353 +0000 UTC m=+158.805091534 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.786805 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.787170 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.287151887 +0000 UTC m=+158.906327058 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.824391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" event={"ID":"828f2820-7234-4e86-81fd-ca8666c1e640","Type":"ContainerStarted","Data":"0efb56a055e652f5890788e79314df140ec5d884120066d33271342042319eb7"} Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.826495 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.826530 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.827568 4771 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mjvfw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.827619 4771 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" podUID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.889254 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.892511 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.390437033 +0000 UTC m=+159.009612304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.974846 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.990039 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.990224 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.490193829 +0000 UTC m=+159.109369000 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:33 crc kubenswrapper[4771]: I1001 14:58:33.990342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:33 crc kubenswrapper[4771]: E1001 14:58:33.990759 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.490748232 +0000 UTC m=+159.109923473 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.091417 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.091585 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.591545335 +0000 UTC m=+159.210720516 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.091722 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.092008 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.591995637 +0000 UTC m=+159.211170808 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.193033 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.693014285 +0000 UTC m=+159.312189456 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.193059 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.193282 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.193600 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.693590539 +0000 UTC m=+159.312765710 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.294858 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.295118 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.79509824 +0000 UTC m=+159.414273401 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.396710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.397036 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.897021742 +0000 UTC m=+159.516196913 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.402992 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:34 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:34 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:34 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.403052 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.497799 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.497983 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:34.997957767 +0000 UTC m=+159.617132938 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.498045 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.498324 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:34.998312656 +0000 UTC m=+159.617487827 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.599117 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.599267 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.099245082 +0000 UTC m=+159.718420263 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.599348 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.599709 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.099696123 +0000 UTC m=+159.718871294 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.700566 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.700717 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.20068916 +0000 UTC m=+159.819864331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.700957 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.701302 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.201287826 +0000 UTC m=+159.820463007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.802302 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.802519 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.302489589 +0000 UTC m=+159.921664770 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.802623 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.802906 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.302893139 +0000 UTC m=+159.922068310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.903529 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.903689 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.40365953 +0000 UTC m=+160.022834701 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.903882 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:34 crc kubenswrapper[4771]: E1001 14:58:34.904423 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.40440925 +0000 UTC m=+160.023584421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:34 crc kubenswrapper[4771]: I1001 14:58:34.941367 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-tk4n9" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.005202 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.005352 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.505321575 +0000 UTC m=+160.124496756 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.005474 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.006106 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.506093906 +0000 UTC m=+160.125269077 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.033640 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m5m6x"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.034560 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.036246 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.056581 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m5m6x"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.106723 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.106943 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.606917509 +0000 UTC m=+160.226092680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.107020 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-utilities\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.107060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvtlq\" (UniqueName: \"kubernetes.io/projected/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-kube-api-access-gvtlq\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.107085 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-catalog-content\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.107182 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.107444 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.607432161 +0000 UTC m=+160.226607332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.208455 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.208665 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.708634274 +0000 UTC m=+160.327809455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.208744 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-utilities\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.208849 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvtlq\" (UniqueName: \"kubernetes.io/projected/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-kube-api-access-gvtlq\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.208899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-catalog-content\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.209022 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: 
\"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.209112 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-utilities\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.209391 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.709382213 +0000 UTC m=+160.328557394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.209475 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-catalog-content\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.223060 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9dq54"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.224106 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.225581 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.234760 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dq54"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.242337 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvtlq\" (UniqueName: \"kubernetes.io/projected/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-kube-api-access-gvtlq\") pod \"certified-operators-m5m6x\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.310034 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.310221 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.810194426 +0000 UTC m=+160.429369587 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.310650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-utilities\") pod \"community-operators-9dq54\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.310760 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4htw\" (UniqueName: \"kubernetes.io/projected/e62d1022-e44e-4353-a6da-b846b2cb2858-kube-api-access-g4htw\") pod \"community-operators-9dq54\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.310803 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-catalog-content\") pod \"community-operators-9dq54\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.310861 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.311163 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.81114946 +0000 UTC m=+160.430324631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.351297 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.411600 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:35 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:35 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:35 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.411680 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.412165 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.412402 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.912385534 +0000 UTC m=+160.531560695 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.412447 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4htw\" (UniqueName: \"kubernetes.io/projected/e62d1022-e44e-4353-a6da-b846b2cb2858-kube-api-access-g4htw\") pod \"community-operators-9dq54\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.412480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-catalog-content\") pod \"community-operators-9dq54\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.412517 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.412538 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-utilities\") pod \"community-operators-9dq54\" (UID: 
\"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.412892 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:35.912875756 +0000 UTC m=+160.532050937 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.412956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-utilities\") pod \"community-operators-9dq54\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.413099 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-catalog-content\") pod \"community-operators-9dq54\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.452045 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bnw5f"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.453039 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.484138 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4htw\" (UniqueName: \"kubernetes.io/projected/e62d1022-e44e-4353-a6da-b846b2cb2858-kube-api-access-g4htw\") pod \"community-operators-9dq54\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.498687 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bnw5f"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.513201 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.513328 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.01330617 +0000 UTC m=+160.632481351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.513484 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-catalog-content\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.513518 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-utilities\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.513616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.513662 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qxzl\" (UniqueName: 
\"kubernetes.io/projected/acd3867f-b2c0-422f-872e-91b01c3e1eed-kube-api-access-8qxzl\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.513910 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.013899244 +0000 UTC m=+160.633074415 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.536187 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.614446 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.614670 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qxzl\" (UniqueName: \"kubernetes.io/projected/acd3867f-b2c0-422f-872e-91b01c3e1eed-kube-api-access-8qxzl\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.614721 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-catalog-content\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.614751 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-utilities\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.615184 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-utilities\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " 
pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.615256 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.115240251 +0000 UTC m=+160.734415422 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.615673 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-catalog-content\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.632847 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-f85cn"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.633828 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.646455 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qxzl\" (UniqueName: \"kubernetes.io/projected/acd3867f-b2c0-422f-872e-91b01c3e1eed-kube-api-access-8qxzl\") pod \"certified-operators-bnw5f\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.679511 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f85cn"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.716439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.716513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-catalog-content\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.716544 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-utilities\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.716589 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbhx\" (UniqueName: \"kubernetes.io/projected/eab21930-5f16-4c0b-a4c9-f309f5cc3049-kube-api-access-ssbhx\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.716881 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.216870024 +0000 UTC m=+160.836045195 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.771289 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.817596 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.817761 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.317719268 +0000 UTC m=+160.936894439 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.818160 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbhx\" (UniqueName: \"kubernetes.io/projected/eab21930-5f16-4c0b-a4c9-f309f5cc3049-kube-api-access-ssbhx\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.818216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.818251 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-catalog-content\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.818283 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-utilities\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.819001 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-utilities\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.819551 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.319536985 +0000 UTC m=+160.938712156 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.819907 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-catalog-content\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.843851 4771 generic.go:334] "Generic (PLEG): container finished" podID="40496e6d-3f79-4478-804b-dc9904473801" containerID="1262a15d0ce773b7696404ec7c2315d8babf9c50bfe5a5ed0b38caf7396c9c11" exitCode=0 Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.843894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" event={"ID":"40496e6d-3f79-4478-804b-dc9904473801","Type":"ContainerDied","Data":"1262a15d0ce773b7696404ec7c2315d8babf9c50bfe5a5ed0b38caf7396c9c11"} Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.849087 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbhx\" (UniqueName: \"kubernetes.io/projected/eab21930-5f16-4c0b-a4c9-f309f5cc3049-kube-api-access-ssbhx\") pod \"community-operators-f85cn\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.882403 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-m5m6x"] Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.919598 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:35 crc kubenswrapper[4771]: E1001 14:58:35.920006 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.419987718 +0000 UTC m=+161.039162889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:35 crc kubenswrapper[4771]: I1001 14:58:35.950501 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:58:36 crc kubenswrapper[4771]: W1001 14:58:36.005935 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode62d1022_e44e_4353_a6da_b846b2cb2858.slice/crio-da254b2561fe1afd7a542115905138debae75bd6cab5ad95ea1d1bfa3520870e WatchSource:0}: Error finding container da254b2561fe1afd7a542115905138debae75bd6cab5ad95ea1d1bfa3520870e: Status 404 returned error can't find the container with id da254b2561fe1afd7a542115905138debae75bd6cab5ad95ea1d1bfa3520870e Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.009403 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9dq54"] Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.024250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.024508 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.524497107 +0000 UTC m=+161.143672278 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.125384 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.126023 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.625994935 +0000 UTC m=+161.245170126 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.126054 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.126387 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.626375246 +0000 UTC m=+161.245550417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.204191 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bnw5f"] Oct 01 14:58:36 crc kubenswrapper[4771]: W1001 14:58:36.215614 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacd3867f_b2c0_422f_872e_91b01c3e1eed.slice/crio-ebabe5161a780394d9ab35f5bd66b4b8150912889eb0ac1957a83f354f933f5c WatchSource:0}: Error finding container ebabe5161a780394d9ab35f5bd66b4b8150912889eb0ac1957a83f354f933f5c: Status 404 returned error can't find the container with id ebabe5161a780394d9ab35f5bd66b4b8150912889eb0ac1957a83f354f933f5c Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.227083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.227472 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.727453195 +0000 UTC m=+161.346628366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.329531 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.329891 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-f85cn"] Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.329907 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.829892689 +0000 UTC m=+161.449067860 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: W1001 14:58:36.360548 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeab21930_5f16_4c0b_a4c9_f309f5cc3049.slice/crio-9199ca63f66a349fe3a58c49e5cee8d31a1258395596bf09ca842c7439dc02c0 WatchSource:0}: Error finding container 9199ca63f66a349fe3a58c49e5cee8d31a1258395596bf09ca842c7439dc02c0: Status 404 returned error can't find the container with id 9199ca63f66a349fe3a58c49e5cee8d31a1258395596bf09ca842c7439dc02c0 Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.405624 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:36 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:36 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:36 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.405671 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.431056 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.431643 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:36.931618186 +0000 UTC m=+161.550793357 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.534515 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.534833 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.034820779 +0000 UTC m=+161.653995950 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.635999 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.636214 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.136181916 +0000 UTC m=+161.755357087 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.636275 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.636624 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.136616357 +0000 UTC m=+161.755791528 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.665405 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.666328 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.668661 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.678463 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.682606 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.737922 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.738071 4771 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.238049456 +0000 UTC m=+161.857224627 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.738141 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.738195 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.738286 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.738572 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.238561559 +0000 UTC m=+161.857736730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.782454 4771 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.840539 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.840804 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.340773118 +0000 UTC m=+161.959948299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.840897 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.840948 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.840979 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.841047 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.841230 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.341211549 +0000 UTC m=+161.960386730 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.853784 4771 generic.go:334] "Generic (PLEG): container finished" podID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerID="157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660" exitCode=0 Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.853865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dq54" event={"ID":"e62d1022-e44e-4353-a6da-b846b2cb2858","Type":"ContainerDied","Data":"157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.853897 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dq54" event={"ID":"e62d1022-e44e-4353-a6da-b846b2cb2858","Type":"ContainerStarted","Data":"da254b2561fe1afd7a542115905138debae75bd6cab5ad95ea1d1bfa3520870e"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.855582 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 
14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.856021 4771 generic.go:334] "Generic (PLEG): container finished" podID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerID="30a337ea8da485d5fdec7140aa8639514aad715d6c764d61899a58932122dae2" exitCode=0 Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.856085 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f85cn" event={"ID":"eab21930-5f16-4c0b-a4c9-f309f5cc3049","Type":"ContainerDied","Data":"30a337ea8da485d5fdec7140aa8639514aad715d6c764d61899a58932122dae2"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.856111 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f85cn" event={"ID":"eab21930-5f16-4c0b-a4c9-f309f5cc3049","Type":"ContainerStarted","Data":"9199ca63f66a349fe3a58c49e5cee8d31a1258395596bf09ca842c7439dc02c0"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.862747 4771 generic.go:334] "Generic (PLEG): container finished" podID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerID="d393237977ea8842c37edbd3e615d26b802f7117a912ffed28efc1f84e27280e" exitCode=0 Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.862876 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnw5f" event={"ID":"acd3867f-b2c0-422f-872e-91b01c3e1eed","Type":"ContainerDied","Data":"d393237977ea8842c37edbd3e615d26b802f7117a912ffed28efc1f84e27280e"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.862913 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnw5f" event={"ID":"acd3867f-b2c0-422f-872e-91b01c3e1eed","Type":"ContainerStarted","Data":"ebabe5161a780394d9ab35f5bd66b4b8150912889eb0ac1957a83f354f933f5c"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.872501 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.877645 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5m6x" event={"ID":"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5","Type":"ContainerDied","Data":"8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.876723 4771 generic.go:334] "Generic (PLEG): container finished" podID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerID="8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9" exitCode=0 Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.883988 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5m6x" event={"ID":"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5","Type":"ContainerStarted","Data":"9274edeb931beec0afd0bc06bcfb580d57a7ecd37484f96913597cc81ac17588"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.889084 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" event={"ID":"828f2820-7234-4e86-81fd-ca8666c1e640","Type":"ContainerStarted","Data":"429d47f0dae4d41e3d349afa7de71e26b2d6796a4596139c2c8c7084d2c232df"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.889135 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" event={"ID":"828f2820-7234-4e86-81fd-ca8666c1e640","Type":"ContainerStarted","Data":"a213a75f75c5823ac5170373637b531bf89d7f1facd1d4cdea518aa8b8081cba"} Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.941720 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.941955 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.441923089 +0000 UTC m=+162.061098260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.942169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:36 crc kubenswrapper[4771]: E1001 14:58:36.943633 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.443614002 +0000 UTC m=+162.062789173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:36 crc kubenswrapper[4771]: I1001 14:58:36.984056 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.030662 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6z4mz"] Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.031627 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.039824 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.043816 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:37 crc kubenswrapper[4771]: E1001 14:58:37.044115 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-01 14:58:37.544101567 +0000 UTC m=+162.163276738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.053084 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6z4mz"] Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.146893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.147194 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-catalog-content\") pod \"redhat-marketplace-6z4mz\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.147215 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-utilities\") pod \"redhat-marketplace-6z4mz\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " 
pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.147251 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgz52\" (UniqueName: \"kubernetes.io/projected/ee2a2c00-b21b-45af-9de5-0cc26da899b3-kube-api-access-bgz52\") pod \"redhat-marketplace-6z4mz\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: E1001 14:58:37.147622 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-01 14:58:37.647594498 +0000 UTC m=+162.266769669 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-zhdpb" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.227990 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.233208 4771 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-01T14:58:36.782743677Z","Handler":null,"Name":""} Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.245433 4771 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.245469 4771 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.261996 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.262366 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgz52\" (UniqueName: \"kubernetes.io/projected/ee2a2c00-b21b-45af-9de5-0cc26da899b3-kube-api-access-bgz52\") pod \"redhat-marketplace-6z4mz\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.262469 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-catalog-content\") pod \"redhat-marketplace-6z4mz\" (UID: 
\"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.262488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-utilities\") pod \"redhat-marketplace-6z4mz\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.263318 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-utilities\") pod \"redhat-marketplace-6z4mz\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.267535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-catalog-content\") pod \"redhat-marketplace-6z4mz\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.272925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.299868 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgz52\" (UniqueName: \"kubernetes.io/projected/ee2a2c00-b21b-45af-9de5-0cc26da899b3-kube-api-access-bgz52\") pod \"redhat-marketplace-6z4mz\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.332142 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.347790 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.365396 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxzp5\" (UniqueName: \"kubernetes.io/projected/40496e6d-3f79-4478-804b-dc9904473801-kube-api-access-pxzp5\") pod \"40496e6d-3f79-4478-804b-dc9904473801\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.365460 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume\") pod \"40496e6d-3f79-4478-804b-dc9904473801\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.365514 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume\") pod \"40496e6d-3f79-4478-804b-dc9904473801\" (UID: \"40496e6d-3f79-4478-804b-dc9904473801\") " Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.365811 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.368234 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume" (OuterVolumeSpecName: "config-volume") pod "40496e6d-3f79-4478-804b-dc9904473801" (UID: "40496e6d-3f79-4478-804b-dc9904473801"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.390523 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "40496e6d-3f79-4478-804b-dc9904473801" (UID: "40496e6d-3f79-4478-804b-dc9904473801"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.404407 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40496e6d-3f79-4478-804b-dc9904473801-kube-api-access-pxzp5" (OuterVolumeSpecName: "kube-api-access-pxzp5") pod "40496e6d-3f79-4478-804b-dc9904473801" (UID: "40496e6d-3f79-4478-804b-dc9904473801"). InnerVolumeSpecName "kube-api-access-pxzp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.419513 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:37 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:37 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:37 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.419565 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.458259 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-h8g2c"] Oct 01 14:58:37 crc kubenswrapper[4771]: E1001 14:58:37.458590 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40496e6d-3f79-4478-804b-dc9904473801" containerName="collect-profiles" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.458622 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="40496e6d-3f79-4478-804b-dc9904473801" containerName="collect-profiles" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.458784 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="40496e6d-3f79-4478-804b-dc9904473801" containerName="collect-profiles" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.466028 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.467282 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxzp5\" (UniqueName: \"kubernetes.io/projected/40496e6d-3f79-4478-804b-dc9904473801-kube-api-access-pxzp5\") on node \"crc\" DevicePath \"\"" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.467316 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/40496e6d-3f79-4478-804b-dc9904473801-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.467325 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/40496e6d-3f79-4478-804b-dc9904473801-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.470518 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8g2c"] Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.479650 4771 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.479706 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.551117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-zhdpb\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.585511 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tcdb\" (UniqueName: \"kubernetes.io/projected/dc5bfdae-9f89-4ef2-b197-69937670c341-kube-api-access-7tcdb\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.585576 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-catalog-content\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.585619 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-utilities\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.605033 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.687379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-catalog-content\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.687524 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6z4mz"] Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.687775 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-utilities\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.688126 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tcdb\" (UniqueName: \"kubernetes.io/projected/dc5bfdae-9f89-4ef2-b197-69937670c341-kube-api-access-7tcdb\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.688148 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-utilities\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.688204 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-catalog-content\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: W1001 14:58:37.699935 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2a2c00_b21b_45af_9de5_0cc26da899b3.slice/crio-645836519f313cb989cf46703d1303657e531546ef20dc8b250f740dc1940d18 WatchSource:0}: Error finding container 645836519f313cb989cf46703d1303657e531546ef20dc8b250f740dc1940d18: Status 404 returned error can't find the container with id 645836519f313cb989cf46703d1303657e531546ef20dc8b250f740dc1940d18 Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.704870 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tcdb\" (UniqueName: \"kubernetes.io/projected/dc5bfdae-9f89-4ef2-b197-69937670c341-kube-api-access-7tcdb\") pod \"redhat-marketplace-h8g2c\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.786200 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.799372 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhdpb"] Oct 01 14:58:37 crc kubenswrapper[4771]: W1001 14:58:37.803323 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b800e30_2559_4c0b_9732_7a069ae3da91.slice/crio-35f75a6301fa6e562a0456ad86e9c733adb667c160045964ade968b6944c736b WatchSource:0}: Error finding container 35f75a6301fa6e562a0456ad86e9c733adb667c160045964ade968b6944c736b: Status 404 returned error can't find the container with id 35f75a6301fa6e562a0456ad86e9c733adb667c160045964ade968b6944c736b Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.903867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" event={"ID":"8b800e30-2559-4c0b-9732-7a069ae3da91","Type":"ContainerStarted","Data":"35f75a6301fa6e562a0456ad86e9c733adb667c160045964ade968b6944c736b"} Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.906408 4771 generic.go:334] "Generic (PLEG): container finished" podID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerID="e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4" exitCode=0 Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.906527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6z4mz" event={"ID":"ee2a2c00-b21b-45af-9de5-0cc26da899b3","Type":"ContainerDied","Data":"e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4"} Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.906598 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6z4mz" 
event={"ID":"ee2a2c00-b21b-45af-9de5-0cc26da899b3","Type":"ContainerStarted","Data":"645836519f313cb989cf46703d1303657e531546ef20dc8b250f740dc1940d18"} Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.911532 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d0dfffa8-847f-4e13-970c-b7f72bd3d92f","Type":"ContainerStarted","Data":"ec570426fc7b87e8486913df8ccba3aaaa8a65e6608400587df021aeab9019b4"} Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.911585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d0dfffa8-847f-4e13-970c-b7f72bd3d92f","Type":"ContainerStarted","Data":"e3af5714ba6b81b16efa649b0933e03a1049663884b420059c8ea55a3153e77b"} Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.935568 4771 patch_prober.go:28] interesting pod/apiserver-76f77b778f-shl79 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]log ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]etcd ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/generic-apiserver-start-informers ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/max-in-flight-filter ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 01 14:58:37 crc kubenswrapper[4771]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Oct 01 14:58:37 crc kubenswrapper[4771]: 
[+]poststarthook/project.openshift.io-projectcache ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-startinformers ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 01 14:58:37 crc kubenswrapper[4771]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 01 14:58:37 crc kubenswrapper[4771]: livez check failed Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.935621 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-shl79" podUID="13778ea7-497e-431d-a3a9-96979d2e4885" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.937205 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" event={"ID":"828f2820-7234-4e86-81fd-ca8666c1e640","Type":"ContainerStarted","Data":"e5121828093af8fb0a589e1ac50f53c3a077c1513d50a85a31eda45163376302"} Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.942786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" event={"ID":"40496e6d-3f79-4478-804b-dc9904473801","Type":"ContainerDied","Data":"8ea2b87434699e66bdcc1fa2e5c8511f110832cb1ea8f07ec0398bca9ebcc2a3"} Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.942819 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ea2b87434699e66bdcc1fa2e5c8511f110832cb1ea8f07ec0398bca9ebcc2a3" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.942905 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz" Oct 01 14:58:37 crc kubenswrapper[4771]: I1001 14:58:37.950613 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.950594111 podStartE2EDuration="1.950594111s" podCreationTimestamp="2025-10-01 14:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:37.948251791 +0000 UTC m=+162.567426972" watchObservedRunningTime="2025-10-01 14:58:37.950594111 +0000 UTC m=+162.569769282" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.002005 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.021719 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-5fnbm" podStartSLOduration=13.021695115 podStartE2EDuration="13.021695115s" podCreationTimestamp="2025-10-01 14:58:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:37.971901775 +0000 UTC m=+162.591076946" watchObservedRunningTime="2025-10-01 14:58:38.021695115 +0000 UTC m=+162.640870286" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.022285 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8g2c"] Oct 01 14:58:38 crc kubenswrapper[4771]: W1001 14:58:38.043402 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc5bfdae_9f89_4ef2_b197_69937670c341.slice/crio-0857d1856fa16309ceec618084603c51f1d80810a52385b0360eeca3092771cc 
WatchSource:0}: Error finding container 0857d1856fa16309ceec618084603c51f1d80810a52385b0360eeca3092771cc: Status 404 returned error can't find the container with id 0857d1856fa16309ceec618084603c51f1d80810a52385b0360eeca3092771cc Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.343878 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.399409 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.402095 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:38 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:38 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:38 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.402151 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.429783 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.435537 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-pvm7n" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.438211 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ldrz6" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.446381 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bxmn4"] Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.448709 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.450896 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.457125 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bxmn4"] Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.500500 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-utilities\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.500685 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9649d\" (UniqueName: \"kubernetes.io/projected/d16ca88c-55fc-497e-818c-ed358e0c4bfb-kube-api-access-9649d\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.500770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-catalog-content\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " 
pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.552380 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.552606 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.553997 4771 patch_prober.go:28] interesting pod/console-f9d7485db-szwtc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.554039 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-szwtc" podUID="9db1abd4-f11c-45e1-9341-6c818c3e3579" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.601581 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-utilities\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.601684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9649d\" (UniqueName: \"kubernetes.io/projected/d16ca88c-55fc-497e-818c-ed358e0c4bfb-kube-api-access-9649d\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.601712 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-catalog-content\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.604349 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-utilities\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.604694 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-catalog-content\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.639348 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9649d\" (UniqueName: \"kubernetes.io/projected/d16ca88c-55fc-497e-818c-ed358e0c4bfb-kube-api-access-9649d\") pod \"redhat-operators-bxmn4\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.691187 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4fgp8" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.733143 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 
14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.733201 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.736326 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.736368 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.769412 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.828227 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bf6jc"] Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.829442 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.849629 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bf6jc"] Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.906608 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd69v\" (UniqueName: \"kubernetes.io/projected/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-kube-api-access-hd69v\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.906953 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-utilities\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.906978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-catalog-content\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.952300 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerID="6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7" exitCode=0 Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.952412 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8g2c" 
event={"ID":"dc5bfdae-9f89-4ef2-b197-69937670c341","Type":"ContainerDied","Data":"6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7"} Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.952438 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8g2c" event={"ID":"dc5bfdae-9f89-4ef2-b197-69937670c341","Type":"ContainerStarted","Data":"0857d1856fa16309ceec618084603c51f1d80810a52385b0360eeca3092771cc"} Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.957898 4771 generic.go:334] "Generic (PLEG): container finished" podID="d0dfffa8-847f-4e13-970c-b7f72bd3d92f" containerID="ec570426fc7b87e8486913df8ccba3aaaa8a65e6608400587df021aeab9019b4" exitCode=0 Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.958033 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d0dfffa8-847f-4e13-970c-b7f72bd3d92f","Type":"ContainerDied","Data":"ec570426fc7b87e8486913df8ccba3aaaa8a65e6608400587df021aeab9019b4"} Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.964964 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" event={"ID":"8b800e30-2559-4c0b-9732-7a069ae3da91","Type":"ContainerStarted","Data":"618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175"} Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.991961 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 14:58:38 crc kubenswrapper[4771]: I1001 14:58:38.993381 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bxmn4"] Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.002231 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" podStartSLOduration=137.002208209 
podStartE2EDuration="2m17.002208209s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:58:38.999596533 +0000 UTC m=+163.618771704" watchObservedRunningTime="2025-10-01 14:58:39.002208209 +0000 UTC m=+163.621383380" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.008191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-utilities\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.008264 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-catalog-content\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.008346 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd69v\" (UniqueName: \"kubernetes.io/projected/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-kube-api-access-hd69v\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.010146 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-utilities\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.011832 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-catalog-content\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:39 crc kubenswrapper[4771]: W1001 14:58:39.017265 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16ca88c_55fc_497e_818c_ed358e0c4bfb.slice/crio-23d5dd4409626d4894a20c327d4474765c0058fc558a03af92887a780fc4f6d2 WatchSource:0}: Error finding container 23d5dd4409626d4894a20c327d4474765c0058fc558a03af92887a780fc4f6d2: Status 404 returned error can't find the container with id 23d5dd4409626d4894a20c327d4474765c0058fc558a03af92887a780fc4f6d2 Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.033037 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd69v\" (UniqueName: \"kubernetes.io/projected/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-kube-api-access-hd69v\") pod \"redhat-operators-bf6jc\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.149679 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.351524 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bf6jc"] Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.403290 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:39 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:39 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:39 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.403344 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.617517 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.618772 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.623093 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.623228 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.634529 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.724060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b0c930b-aa23-40ab-892a-fe0eb123351c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4b0c930b-aa23-40ab-892a-fe0eb123351c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.724391 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b0c930b-aa23-40ab-892a-fe0eb123351c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4b0c930b-aa23-40ab-892a-fe0eb123351c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.825392 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b0c930b-aa23-40ab-892a-fe0eb123351c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4b0c930b-aa23-40ab-892a-fe0eb123351c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.825440 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/4b0c930b-aa23-40ab-892a-fe0eb123351c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4b0c930b-aa23-40ab-892a-fe0eb123351c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.825581 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b0c930b-aa23-40ab-892a-fe0eb123351c-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"4b0c930b-aa23-40ab-892a-fe0eb123351c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.843718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b0c930b-aa23-40ab-892a-fe0eb123351c-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"4b0c930b-aa23-40ab-892a-fe0eb123351c\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.954259 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.977617 4771 generic.go:334] "Generic (PLEG): container finished" podID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerID="427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75" exitCode=0 Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.977668 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxmn4" event={"ID":"d16ca88c-55fc-497e-818c-ed358e0c4bfb","Type":"ContainerDied","Data":"427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75"} Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.977821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxmn4" event={"ID":"d16ca88c-55fc-497e-818c-ed358e0c4bfb","Type":"ContainerStarted","Data":"23d5dd4409626d4894a20c327d4474765c0058fc558a03af92887a780fc4f6d2"} Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.982214 4771 generic.go:334] "Generic (PLEG): container finished" podID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerID="f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0" exitCode=0 Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.982293 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bf6jc" event={"ID":"8d691866-2c89-43bc-9d4b-931c6ac2b4d7","Type":"ContainerDied","Data":"f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0"} Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.982350 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bf6jc" event={"ID":"8d691866-2c89-43bc-9d4b-931c6ac2b4d7","Type":"ContainerStarted","Data":"7f900588b1defff85cecd6a2b750afca4dfab02df369e6cd97483b77ea92aabd"} Oct 01 14:58:39 crc kubenswrapper[4771]: I1001 14:58:39.982555 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.143927 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-mhd4x" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.219188 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.331489 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kubelet-dir\") pod \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\" (UID: \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\") " Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.331591 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kube-api-access\") pod \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\" (UID: \"d0dfffa8-847f-4e13-970c-b7f72bd3d92f\") " Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.331923 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d0dfffa8-847f-4e13-970c-b7f72bd3d92f" (UID: "d0dfffa8-847f-4e13-970c-b7f72bd3d92f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.335910 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d0dfffa8-847f-4e13-970c-b7f72bd3d92f" (UID: "d0dfffa8-847f-4e13-970c-b7f72bd3d92f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.405586 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:40 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:40 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:40 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.405677 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.436122 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.436162 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d0dfffa8-847f-4e13-970c-b7f72bd3d92f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.448503 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 01 14:58:40 crc kubenswrapper[4771]: W1001 14:58:40.459874 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4b0c930b_aa23_40ab_892a_fe0eb123351c.slice/crio-ff1d4b3f82312918331299ab0ba7649fe41c0cb5a12108399f4925274c5923b5 WatchSource:0}: Error finding container ff1d4b3f82312918331299ab0ba7649fe41c0cb5a12108399f4925274c5923b5: Status 404 returned error 
can't find the container with id ff1d4b3f82312918331299ab0ba7649fe41c0cb5a12108399f4925274c5923b5 Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.993778 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4b0c930b-aa23-40ab-892a-fe0eb123351c","Type":"ContainerStarted","Data":"1a7138a2b10d3727283b23fd1fcd550b25091b653848977315de08cbbbfca34c"} Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.994116 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4b0c930b-aa23-40ab-892a-fe0eb123351c","Type":"ContainerStarted","Data":"ff1d4b3f82312918331299ab0ba7649fe41c0cb5a12108399f4925274c5923b5"} Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.995754 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"d0dfffa8-847f-4e13-970c-b7f72bd3d92f","Type":"ContainerDied","Data":"e3af5714ba6b81b16efa649b0933e03a1049663884b420059c8ea55a3153e77b"} Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.995786 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3af5714ba6b81b16efa649b0933e03a1049663884b420059c8ea55a3153e77b" Oct 01 14:58:40 crc kubenswrapper[4771]: I1001 14:58:40.995804 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 01 14:58:41 crc kubenswrapper[4771]: I1001 14:58:41.401922 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:41 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:41 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:41 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:41 crc kubenswrapper[4771]: I1001 14:58:41.401987 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:42 crc kubenswrapper[4771]: I1001 14:58:42.019406 4771 generic.go:334] "Generic (PLEG): container finished" podID="4b0c930b-aa23-40ab-892a-fe0eb123351c" containerID="1a7138a2b10d3727283b23fd1fcd550b25091b653848977315de08cbbbfca34c" exitCode=0 Oct 01 14:58:42 crc kubenswrapper[4771]: I1001 14:58:42.019447 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4b0c930b-aa23-40ab-892a-fe0eb123351c","Type":"ContainerDied","Data":"1a7138a2b10d3727283b23fd1fcd550b25091b653848977315de08cbbbfca34c"} Oct 01 14:58:42 crc kubenswrapper[4771]: I1001 14:58:42.177369 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:58:42 crc kubenswrapper[4771]: I1001 14:58:42.177433 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:58:42 crc kubenswrapper[4771]: I1001 14:58:42.403092 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:42 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:42 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:42 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:42 crc kubenswrapper[4771]: I1001 14:58:42.403154 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:42 crc kubenswrapper[4771]: I1001 14:58:42.933450 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:42 crc kubenswrapper[4771]: I1001 14:58:42.937832 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-shl79" Oct 01 14:58:43 crc kubenswrapper[4771]: I1001 14:58:43.283331 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 14:58:43 crc kubenswrapper[4771]: I1001 14:58:43.461257 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:43 
crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:43 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:43 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:43 crc kubenswrapper[4771]: I1001 14:58:43.461312 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:44 crc kubenswrapper[4771]: I1001 14:58:44.402422 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:44 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:44 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:44 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:44 crc kubenswrapper[4771]: I1001 14:58:44.402467 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:45 crc kubenswrapper[4771]: I1001 14:58:45.213433 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod \"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:45 crc kubenswrapper[4771]: I1001 14:58:45.224551 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a49c960d-cfd1-4745-976b-59c62e3dcf8e-metrics-certs\") pod 
\"network-metrics-daemon-8qdkc\" (UID: \"a49c960d-cfd1-4745-976b-59c62e3dcf8e\") " pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:45 crc kubenswrapper[4771]: I1001 14:58:45.229332 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8qdkc" Oct 01 14:58:45 crc kubenswrapper[4771]: I1001 14:58:45.403636 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:45 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:45 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:45 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:45 crc kubenswrapper[4771]: I1001 14:58:45.404533 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:46 crc kubenswrapper[4771]: I1001 14:58:46.402424 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:46 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:46 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:46 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:46 crc kubenswrapper[4771]: I1001 14:58:46.402513 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 
14:58:47 crc kubenswrapper[4771]: I1001 14:58:47.402862 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:47 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:47 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:47 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:47 crc kubenswrapper[4771]: I1001 14:58:47.403350 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:48 crc kubenswrapper[4771]: I1001 14:58:48.402097 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:48 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:48 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:48 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:48 crc kubenswrapper[4771]: I1001 14:58:48.402155 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:48 crc kubenswrapper[4771]: I1001 14:58:48.553094 4771 patch_prober.go:28] interesting pod/console-f9d7485db-szwtc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Oct 01 14:58:48 crc 
kubenswrapper[4771]: I1001 14:58:48.553149 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-szwtc" podUID="9db1abd4-f11c-45e1-9341-6c818c3e3579" containerName="console" probeResult="failure" output="Get \"https://10.217.0.18:8443/health\": dial tcp 10.217.0.18:8443: connect: connection refused" Oct 01 14:58:48 crc kubenswrapper[4771]: I1001 14:58:48.733073 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:58:48 crc kubenswrapper[4771]: I1001 14:58:48.733135 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:58:48 crc kubenswrapper[4771]: I1001 14:58:48.733073 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:58:48 crc kubenswrapper[4771]: I1001 14:58:48.733207 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:58:49 crc kubenswrapper[4771]: I1001 14:58:49.402886 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:49 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:49 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:49 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:49 crc kubenswrapper[4771]: I1001 14:58:49.403227 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.044208 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.077557 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"4b0c930b-aa23-40ab-892a-fe0eb123351c","Type":"ContainerDied","Data":"ff1d4b3f82312918331299ab0ba7649fe41c0cb5a12108399f4925274c5923b5"} Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.077603 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff1d4b3f82312918331299ab0ba7649fe41c0cb5a12108399f4925274c5923b5" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.077663 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.095686 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b0c930b-aa23-40ab-892a-fe0eb123351c-kube-api-access\") pod \"4b0c930b-aa23-40ab-892a-fe0eb123351c\" (UID: \"4b0c930b-aa23-40ab-892a-fe0eb123351c\") " Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.096061 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b0c930b-aa23-40ab-892a-fe0eb123351c-kubelet-dir\") pod \"4b0c930b-aa23-40ab-892a-fe0eb123351c\" (UID: \"4b0c930b-aa23-40ab-892a-fe0eb123351c\") " Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.096372 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b0c930b-aa23-40ab-892a-fe0eb123351c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4b0c930b-aa23-40ab-892a-fe0eb123351c" (UID: "4b0c930b-aa23-40ab-892a-fe0eb123351c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.102210 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b0c930b-aa23-40ab-892a-fe0eb123351c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4b0c930b-aa23-40ab-892a-fe0eb123351c" (UID: "4b0c930b-aa23-40ab-892a-fe0eb123351c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.198420 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4b0c930b-aa23-40ab-892a-fe0eb123351c-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.198459 4771 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b0c930b-aa23-40ab-892a-fe0eb123351c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.403510 4771 patch_prober.go:28] interesting pod/router-default-5444994796-wpxh2 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 01 14:58:50 crc kubenswrapper[4771]: [-]has-synced failed: reason withheld Oct 01 14:58:50 crc kubenswrapper[4771]: [+]process-running ok Oct 01 14:58:50 crc kubenswrapper[4771]: healthz check failed Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.403583 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wpxh2" podUID="a2238a30-4ae1-4bd8-acfa-1e357552252c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 01 14:58:50 crc kubenswrapper[4771]: I1001 14:58:50.450066 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8qdkc"] Oct 01 14:58:51 crc kubenswrapper[4771]: I1001 14:58:51.402597 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:51 crc kubenswrapper[4771]: I1001 14:58:51.405877 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wpxh2" Oct 01 14:58:55 crc 
kubenswrapper[4771]: I1001 14:58:55.145395 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 01 14:58:57 crc kubenswrapper[4771]: I1001 14:58:57.618433 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.603502 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.609879 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-szwtc" Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.733631 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.733687 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.733746 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.734261 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"b91f19180c14d798fb0e8eec2e69be4d8718dd33f4af111958c5050e58f276e7"} pod="openshift-console/downloads-7954f5f757-nrlb2" containerMessage="Container download-server 
failed liveness probe, will be restarted" Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.734329 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" containerID="cri-o://b91f19180c14d798fb0e8eec2e69be4d8718dd33f4af111958c5050e58f276e7" gracePeriod=2 Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.735017 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.735047 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.735320 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:58:58 crc kubenswrapper[4771]: I1001 14:58:58.735554 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:59:00 crc kubenswrapper[4771]: W1001 14:59:00.027628 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda49c960d_cfd1_4745_976b_59c62e3dcf8e.slice/crio-83577eda7569402936fb2ad9826264bdc9421e56ff069f502643c1f576a72fdd WatchSource:0}: Error finding container 83577eda7569402936fb2ad9826264bdc9421e56ff069f502643c1f576a72fdd: Status 404 returned error can't find the container with id 83577eda7569402936fb2ad9826264bdc9421e56ff069f502643c1f576a72fdd Oct 01 14:59:00 crc kubenswrapper[4771]: I1001 14:59:00.141300 4771 generic.go:334] "Generic (PLEG): container finished" podID="d92fedc8-d031-40c1-b9fa-695496499a26" containerID="b91f19180c14d798fb0e8eec2e69be4d8718dd33f4af111958c5050e58f276e7" exitCode=0 Oct 01 14:59:00 crc kubenswrapper[4771]: I1001 14:59:00.141392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nrlb2" event={"ID":"d92fedc8-d031-40c1-b9fa-695496499a26","Type":"ContainerDied","Data":"b91f19180c14d798fb0e8eec2e69be4d8718dd33f4af111958c5050e58f276e7"} Oct 01 14:59:00 crc kubenswrapper[4771]: I1001 14:59:00.142340 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" event={"ID":"a49c960d-cfd1-4745-976b-59c62e3dcf8e","Type":"ContainerStarted","Data":"83577eda7569402936fb2ad9826264bdc9421e56ff069f502643c1f576a72fdd"} Oct 01 14:59:05 crc kubenswrapper[4771]: E1001 14:59:05.016135 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 14:59:05 crc kubenswrapper[4771]: E1001 14:59:05.016668 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8qxzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bnw5f_openshift-marketplace(acd3867f-b2c0-422f-872e-91b01c3e1eed): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 14:59:05 crc kubenswrapper[4771]: E1001 14:59:05.017999 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bnw5f" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" Oct 01 14:59:08 crc 
kubenswrapper[4771]: E1001 14:59:08.134228 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bnw5f" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" Oct 01 14:59:08 crc kubenswrapper[4771]: I1001 14:59:08.682225 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-8pc29" Oct 01 14:59:08 crc kubenswrapper[4771]: I1001 14:59:08.736342 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:59:08 crc kubenswrapper[4771]: I1001 14:59:08.736437 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:59:10 crc kubenswrapper[4771]: E1001 14:59:10.675467 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 14:59:10 crc kubenswrapper[4771]: E1001 14:59:10.675646 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ssbhx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-f85cn_openshift-marketplace(eab21930-5f16-4c0b-a4c9-f309f5cc3049): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 14:59:10 crc kubenswrapper[4771]: E1001 14:59:10.676883 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-f85cn" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" Oct 01 14:59:12 crc 
kubenswrapper[4771]: I1001 14:59:12.177886 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:59:12 crc kubenswrapper[4771]: I1001 14:59:12.177983 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:59:14 crc kubenswrapper[4771]: E1001 14:59:14.637803 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Oct 01 14:59:14 crc kubenswrapper[4771]: E1001 14:59:14.638395 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gvtlq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-m5m6x_openshift-marketplace(05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 14:59:14 crc kubenswrapper[4771]: E1001 14:59:14.639724 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-m5m6x" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" Oct 01 14:59:15 crc 
kubenswrapper[4771]: E1001 14:59:15.166700 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-f85cn" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" Oct 01 14:59:18 crc kubenswrapper[4771]: E1001 14:59:18.620078 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-m5m6x" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" Oct 01 14:59:18 crc kubenswrapper[4771]: I1001 14:59:18.733809 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:59:18 crc kubenswrapper[4771]: I1001 14:59:18.733882 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:59:21 crc kubenswrapper[4771]: E1001 14:59:21.870180 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2785913079/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 14:59:21 crc kubenswrapper[4771]: E1001 14:59:21.870469 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bgz52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-6z4mz_openshift-marketplace(ee2a2c00-b21b-45af-9de5-0cc26da899b3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage2785913079/2\": happened during read: context canceled" logger="UnhandledError" Oct 01 14:59:21 crc kubenswrapper[4771]: E1001 14:59:21.871881 4771 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage2785913079/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-6z4mz" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" Oct 01 14:59:22 crc kubenswrapper[4771]: E1001 14:59:22.649619 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-6z4mz" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" Oct 01 14:59:22 crc kubenswrapper[4771]: E1001 14:59:22.683947 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Oct 01 14:59:22 crc kubenswrapper[4771]: E1001 14:59:22.684121 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g4htw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-9dq54_openshift-marketplace(e62d1022-e44e-4353-a6da-b846b2cb2858): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 14:59:22 crc kubenswrapper[4771]: E1001 14:59:22.685394 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-9dq54" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" Oct 01 14:59:28 crc 
kubenswrapper[4771]: E1001 14:59:28.295080 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-9dq54" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" Oct 01 14:59:28 crc kubenswrapper[4771]: E1001 14:59:28.355559 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Oct 01 14:59:28 crc kubenswrapper[4771]: E1001 14:59:28.355757 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7tcdb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-h8g2c_openshift-marketplace(dc5bfdae-9f89-4ef2-b197-69937670c341): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 14:59:28 crc kubenswrapper[4771]: E1001 14:59:28.357314 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-h8g2c" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" Oct 01 14:59:28 crc 
kubenswrapper[4771]: I1001 14:59:28.733525 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:59:28 crc kubenswrapper[4771]: I1001 14:59:28.733593 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:59:31 crc kubenswrapper[4771]: E1001 14:59:31.138068 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-h8g2c" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" Oct 01 14:59:31 crc kubenswrapper[4771]: E1001 14:59:31.186379 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 14:59:31 crc kubenswrapper[4771]: E1001 14:59:31.186545 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9649d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-bxmn4_openshift-marketplace(d16ca88c-55fc-497e-818c-ed358e0c4bfb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\": context canceled" logger="UnhandledError" Oct 01 14:59:31 crc kubenswrapper[4771]: E1001 14:59:31.187800 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: reading blob sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:a9db7a3b30ecf2d0f1b396fcba52764b4f9a80a670461c1ec06db87ff269ea06\\\": context canceled\"" pod="openshift-marketplace/redhat-operators-bxmn4" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" Oct 01 14:59:31 crc kubenswrapper[4771]: E1001 14:59:31.227507 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Oct 01 14:59:31 crc kubenswrapper[4771]: E1001 14:59:31.227686 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hd69v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-bf6jc_openshift-marketplace(8d691866-2c89-43bc-9d4b-931c6ac2b4d7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 14:59:31 crc kubenswrapper[4771]: E1001 14:59:31.228901 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-bf6jc" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" Oct 01 14:59:31 crc 
kubenswrapper[4771]: E1001 14:59:31.324338 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bf6jc" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" Oct 01 14:59:31 crc kubenswrapper[4771]: E1001 14:59:31.324422 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-bxmn4" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.328078 4771 generic.go:334] "Generic (PLEG): container finished" podID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerID="e2308d0346fcdb39c4e1d751bdf41c0e01f0d41cc1fc7ab4b9980322fdf932db" exitCode=0 Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.328582 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnw5f" event={"ID":"acd3867f-b2c0-422f-872e-91b01c3e1eed","Type":"ContainerDied","Data":"e2308d0346fcdb39c4e1d751bdf41c0e01f0d41cc1fc7ab4b9980322fdf932db"} Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.334485 4771 generic.go:334] "Generic (PLEG): container finished" podID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerID="716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e" exitCode=0 Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.335009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5m6x" event={"ID":"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5","Type":"ContainerDied","Data":"716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e"} Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.339401 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nrlb2" event={"ID":"d92fedc8-d031-40c1-b9fa-695496499a26","Type":"ContainerStarted","Data":"cefc6df4daace096baaef152243d34090bab805590edddca7c9f9f3d9cd9b2d3"} Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.340056 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.340408 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.340439 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.347316 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" event={"ID":"a49c960d-cfd1-4745-976b-59c62e3dcf8e","Type":"ContainerStarted","Data":"efbb1585f4d8b592321cfc68f5fcd6e2c5b8ce9465642d77817ea1888dafe420"} Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.347368 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8qdkc" event={"ID":"a49c960d-cfd1-4745-976b-59c62e3dcf8e","Type":"ContainerStarted","Data":"9fa06e73c22880192726d5da170dd009302583c436bc27d0c070076a55eea49a"} Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.368284 4771 generic.go:334] "Generic (PLEG): container finished" podID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" 
containerID="798356bdcf5eeb9e9293e376169768ffb5ad06622aeb1fcfcb7b03cd04c9653a" exitCode=0 Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.368345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f85cn" event={"ID":"eab21930-5f16-4c0b-a4c9-f309f5cc3049","Type":"ContainerDied","Data":"798356bdcf5eeb9e9293e376169768ffb5ad06622aeb1fcfcb7b03cd04c9653a"} Oct 01 14:59:32 crc kubenswrapper[4771]: I1001 14:59:32.448869 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8qdkc" podStartSLOduration=190.448846466 podStartE2EDuration="3m10.448846466s" podCreationTimestamp="2025-10-01 14:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 14:59:32.442685417 +0000 UTC m=+217.061860588" watchObservedRunningTime="2025-10-01 14:59:32.448846466 +0000 UTC m=+217.068021637" Oct 01 14:59:33 crc kubenswrapper[4771]: I1001 14:59:33.375533 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5m6x" event={"ID":"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5","Type":"ContainerStarted","Data":"79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3"} Oct 01 14:59:33 crc kubenswrapper[4771]: I1001 14:59:33.377749 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f85cn" event={"ID":"eab21930-5f16-4c0b-a4c9-f309f5cc3049","Type":"ContainerStarted","Data":"8d270a429d8ac7ba40fb633cfef9d37718e9e48458133966fcdb94d5598318a2"} Oct 01 14:59:33 crc kubenswrapper[4771]: I1001 14:59:33.379793 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnw5f" event={"ID":"acd3867f-b2c0-422f-872e-91b01c3e1eed","Type":"ContainerStarted","Data":"0df68d9b3da4099800efc328c6ad3ccce05534a6fb7ed01e2a3f1e7667b45733"} Oct 01 14:59:33 crc 
kubenswrapper[4771]: I1001 14:59:33.380776 4771 patch_prober.go:28] interesting pod/downloads-7954f5f757-nrlb2 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Oct 01 14:59:33 crc kubenswrapper[4771]: I1001 14:59:33.380819 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nrlb2" podUID="d92fedc8-d031-40c1-b9fa-695496499a26" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.22:8080/\": dial tcp 10.217.0.22:8080: connect: connection refused" Oct 01 14:59:33 crc kubenswrapper[4771]: I1001 14:59:33.402389 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m5m6x" podStartSLOduration=2.529378459 podStartE2EDuration="58.402368275s" podCreationTimestamp="2025-10-01 14:58:35 +0000 UTC" firstStartedPulling="2025-10-01 14:58:36.878959362 +0000 UTC m=+161.498134543" lastFinishedPulling="2025-10-01 14:59:32.751949188 +0000 UTC m=+217.371124359" observedRunningTime="2025-10-01 14:59:33.399539652 +0000 UTC m=+218.018714833" watchObservedRunningTime="2025-10-01 14:59:33.402368275 +0000 UTC m=+218.021543466" Oct 01 14:59:33 crc kubenswrapper[4771]: I1001 14:59:33.421417 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-f85cn" podStartSLOduration=2.470798374 podStartE2EDuration="58.421403337s" podCreationTimestamp="2025-10-01 14:58:35 +0000 UTC" firstStartedPulling="2025-10-01 14:58:36.857394752 +0000 UTC m=+161.476569923" lastFinishedPulling="2025-10-01 14:59:32.807999715 +0000 UTC m=+217.427174886" observedRunningTime="2025-10-01 14:59:33.419838036 +0000 UTC m=+218.039013217" watchObservedRunningTime="2025-10-01 14:59:33.421403337 +0000 UTC m=+218.040578508" Oct 01 14:59:33 crc kubenswrapper[4771]: I1001 
14:59:33.443115 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bnw5f" podStartSLOduration=2.609683305 podStartE2EDuration="58.443097876s" podCreationTimestamp="2025-10-01 14:58:35 +0000 UTC" firstStartedPulling="2025-10-01 14:58:36.864026721 +0000 UTC m=+161.483201892" lastFinishedPulling="2025-10-01 14:59:32.697441292 +0000 UTC m=+217.316616463" observedRunningTime="2025-10-01 14:59:33.442066449 +0000 UTC m=+218.061241640" watchObservedRunningTime="2025-10-01 14:59:33.443097876 +0000 UTC m=+218.062273047" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.351486 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.351867 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.748430 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.772786 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.772847 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.840126 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.951564 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.951639 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:59:35 crc kubenswrapper[4771]: I1001 14:59:35.996766 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:59:38 crc kubenswrapper[4771]: I1001 14:59:38.743786 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nrlb2" Oct 01 14:59:41 crc kubenswrapper[4771]: I1001 14:59:41.448609 4771 generic.go:334] "Generic (PLEG): container finished" podID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerID="1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c" exitCode=0 Oct 01 14:59:41 crc kubenswrapper[4771]: I1001 14:59:41.448772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6z4mz" event={"ID":"ee2a2c00-b21b-45af-9de5-0cc26da899b3","Type":"ContainerDied","Data":"1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c"} Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.177319 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.177690 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.177749 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.178335 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.178417 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856" gracePeriod=600 Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.457163 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6z4mz" event={"ID":"ee2a2c00-b21b-45af-9de5-0cc26da899b3","Type":"ContainerStarted","Data":"0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237"} Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.458979 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856" exitCode=0 Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.459014 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856"} Oct 01 14:59:42 crc kubenswrapper[4771]: I1001 14:59:42.476925 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6z4mz" 
podStartSLOduration=1.353887701 podStartE2EDuration="1m5.47690305s" podCreationTimestamp="2025-10-01 14:58:37 +0000 UTC" firstStartedPulling="2025-10-01 14:58:37.908331592 +0000 UTC m=+162.527506763" lastFinishedPulling="2025-10-01 14:59:42.031346941 +0000 UTC m=+226.650522112" observedRunningTime="2025-10-01 14:59:42.473462082 +0000 UTC m=+227.092637273" watchObservedRunningTime="2025-10-01 14:59:42.47690305 +0000 UTC m=+227.096078221" Oct 01 14:59:43 crc kubenswrapper[4771]: I1001 14:59:43.467199 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxmn4" event={"ID":"d16ca88c-55fc-497e-818c-ed358e0c4bfb","Type":"ContainerStarted","Data":"8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef"} Oct 01 14:59:43 crc kubenswrapper[4771]: I1001 14:59:43.469811 4771 generic.go:334] "Generic (PLEG): container finished" podID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerID="9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4" exitCode=0 Oct 01 14:59:43 crc kubenswrapper[4771]: I1001 14:59:43.469868 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dq54" event={"ID":"e62d1022-e44e-4353-a6da-b846b2cb2858","Type":"ContainerDied","Data":"9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4"} Oct 01 14:59:43 crc kubenswrapper[4771]: I1001 14:59:43.474272 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"77301768e84aaf763ab3e5229d66257dc37283214f6e3b73666999b30255a8ac"} Oct 01 14:59:44 crc kubenswrapper[4771]: I1001 14:59:44.480670 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8g2c" event={"ID":"dc5bfdae-9f89-4ef2-b197-69937670c341","Type":"ContainerStarted","Data":"01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168"} 
Oct 01 14:59:44 crc kubenswrapper[4771]: I1001 14:59:44.482536 4771 generic.go:334] "Generic (PLEG): container finished" podID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerID="8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef" exitCode=0 Oct 01 14:59:44 crc kubenswrapper[4771]: I1001 14:59:44.483060 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxmn4" event={"ID":"d16ca88c-55fc-497e-818c-ed358e0c4bfb","Type":"ContainerDied","Data":"8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef"} Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.410352 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.489132 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerID="01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168" exitCode=0 Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.489238 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8g2c" event={"ID":"dc5bfdae-9f89-4ef2-b197-69937670c341","Type":"ContainerDied","Data":"01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168"} Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.491250 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dq54" event={"ID":"e62d1022-e44e-4353-a6da-b846b2cb2858","Type":"ContainerStarted","Data":"afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca"} Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.532218 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9dq54" podStartSLOduration=3.177563125 podStartE2EDuration="1m10.532200715s" podCreationTimestamp="2025-10-01 14:58:35 +0000 UTC" 
firstStartedPulling="2025-10-01 14:58:36.855207996 +0000 UTC m=+161.474383167" lastFinishedPulling="2025-10-01 14:59:44.209845576 +0000 UTC m=+228.829020757" observedRunningTime="2025-10-01 14:59:45.531417605 +0000 UTC m=+230.150592816" watchObservedRunningTime="2025-10-01 14:59:45.532200715 +0000 UTC m=+230.151375886" Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.536763 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.536801 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.831827 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:59:45 crc kubenswrapper[4771]: I1001 14:59:45.996221 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:59:46 crc kubenswrapper[4771]: I1001 14:59:46.575481 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9dq54" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="registry-server" probeResult="failure" output=< Oct 01 14:59:46 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Oct 01 14:59:46 crc kubenswrapper[4771]: > Oct 01 14:59:46 crc kubenswrapper[4771]: I1001 14:59:46.817035 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f85cn"] Oct 01 14:59:46 crc kubenswrapper[4771]: I1001 14:59:46.817294 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-f85cn" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerName="registry-server" 
containerID="cri-o://8d270a429d8ac7ba40fb633cfef9d37718e9e48458133966fcdb94d5598318a2" gracePeriod=2 Oct 01 14:59:47 crc kubenswrapper[4771]: I1001 14:59:47.348350 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:59:47 crc kubenswrapper[4771]: I1001 14:59:47.348418 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:59:47 crc kubenswrapper[4771]: I1001 14:59:47.408265 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:59:47 crc kubenswrapper[4771]: I1001 14:59:47.544751 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 14:59:48 crc kubenswrapper[4771]: I1001 14:59:48.509446 4771 generic.go:334] "Generic (PLEG): container finished" podID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerID="8d270a429d8ac7ba40fb633cfef9d37718e9e48458133966fcdb94d5598318a2" exitCode=0 Oct 01 14:59:48 crc kubenswrapper[4771]: I1001 14:59:48.509529 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f85cn" event={"ID":"eab21930-5f16-4c0b-a4c9-f309f5cc3049","Type":"ContainerDied","Data":"8d270a429d8ac7ba40fb633cfef9d37718e9e48458133966fcdb94d5598318a2"} Oct 01 14:59:49 crc kubenswrapper[4771]: I1001 14:59:49.817316 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bnw5f"] Oct 01 14:59:49 crc kubenswrapper[4771]: I1001 14:59:49.817904 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bnw5f" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerName="registry-server" containerID="cri-o://0df68d9b3da4099800efc328c6ad3ccce05534a6fb7ed01e2a3f1e7667b45733" gracePeriod=2 Oct 01 14:59:51 crc 
kubenswrapper[4771]: I1001 14:59:51.162197 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.209448 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-utilities\") pod \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.209572 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbhx\" (UniqueName: \"kubernetes.io/projected/eab21930-5f16-4c0b-a4c9-f309f5cc3049-kube-api-access-ssbhx\") pod \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.209657 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-catalog-content\") pod \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\" (UID: \"eab21930-5f16-4c0b-a4c9-f309f5cc3049\") " Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.210376 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-utilities" (OuterVolumeSpecName: "utilities") pod "eab21930-5f16-4c0b-a4c9-f309f5cc3049" (UID: "eab21930-5f16-4c0b-a4c9-f309f5cc3049"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.217384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab21930-5f16-4c0b-a4c9-f309f5cc3049-kube-api-access-ssbhx" (OuterVolumeSpecName: "kube-api-access-ssbhx") pod "eab21930-5f16-4c0b-a4c9-f309f5cc3049" (UID: "eab21930-5f16-4c0b-a4c9-f309f5cc3049"). InnerVolumeSpecName "kube-api-access-ssbhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.262689 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eab21930-5f16-4c0b-a4c9-f309f5cc3049" (UID: "eab21930-5f16-4c0b-a4c9-f309f5cc3049"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.310362 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.310394 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssbhx\" (UniqueName: \"kubernetes.io/projected/eab21930-5f16-4c0b-a4c9-f309f5cc3049-kube-api-access-ssbhx\") on node \"crc\" DevicePath \"\"" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.310404 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eab21930-5f16-4c0b-a4c9-f309f5cc3049-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.525683 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-f85cn" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.525659 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-f85cn" event={"ID":"eab21930-5f16-4c0b-a4c9-f309f5cc3049","Type":"ContainerDied","Data":"9199ca63f66a349fe3a58c49e5cee8d31a1258395596bf09ca842c7439dc02c0"} Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.526016 4771 scope.go:117] "RemoveContainer" containerID="8d270a429d8ac7ba40fb633cfef9d37718e9e48458133966fcdb94d5598318a2" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.528031 4771 generic.go:334] "Generic (PLEG): container finished" podID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerID="0df68d9b3da4099800efc328c6ad3ccce05534a6fb7ed01e2a3f1e7667b45733" exitCode=0 Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.528072 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnw5f" event={"ID":"acd3867f-b2c0-422f-872e-91b01c3e1eed","Type":"ContainerDied","Data":"0df68d9b3da4099800efc328c6ad3ccce05534a6fb7ed01e2a3f1e7667b45733"} Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.564539 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-f85cn"] Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.569840 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-f85cn"] Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.670815 4771 scope.go:117] "RemoveContainer" containerID="798356bdcf5eeb9e9293e376169768ffb5ad06622aeb1fcfcb7b03cd04c9653a" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.700167 4771 scope.go:117] "RemoveContainer" containerID="30a337ea8da485d5fdec7140aa8639514aad715d6c764d61899a58932122dae2" Oct 01 14:59:51 crc kubenswrapper[4771]: I1001 14:59:51.994554 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" path="/var/lib/kubelet/pods/eab21930-5f16-4c0b-a4c9-f309f5cc3049/volumes" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.421302 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.525195 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-catalog-content\") pod \"acd3867f-b2c0-422f-872e-91b01c3e1eed\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.525281 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-utilities\") pod \"acd3867f-b2c0-422f-872e-91b01c3e1eed\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.525380 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qxzl\" (UniqueName: \"kubernetes.io/projected/acd3867f-b2c0-422f-872e-91b01c3e1eed-kube-api-access-8qxzl\") pod \"acd3867f-b2c0-422f-872e-91b01c3e1eed\" (UID: \"acd3867f-b2c0-422f-872e-91b01c3e1eed\") " Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.527107 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-utilities" (OuterVolumeSpecName: "utilities") pod "acd3867f-b2c0-422f-872e-91b01c3e1eed" (UID: "acd3867f-b2c0-422f-872e-91b01c3e1eed"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.534428 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd3867f-b2c0-422f-872e-91b01c3e1eed-kube-api-access-8qxzl" (OuterVolumeSpecName: "kube-api-access-8qxzl") pod "acd3867f-b2c0-422f-872e-91b01c3e1eed" (UID: "acd3867f-b2c0-422f-872e-91b01c3e1eed"). InnerVolumeSpecName "kube-api-access-8qxzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.537838 4771 generic.go:334] "Generic (PLEG): container finished" podID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerID="6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2" exitCode=0 Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.537894 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bf6jc" event={"ID":"8d691866-2c89-43bc-9d4b-931c6ac2b4d7","Type":"ContainerDied","Data":"6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2"} Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.541953 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bnw5f" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.541981 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bnw5f" event={"ID":"acd3867f-b2c0-422f-872e-91b01c3e1eed","Type":"ContainerDied","Data":"ebabe5161a780394d9ab35f5bd66b4b8150912889eb0ac1957a83f354f933f5c"} Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.542096 4771 scope.go:117] "RemoveContainer" containerID="0df68d9b3da4099800efc328c6ad3ccce05534a6fb7ed01e2a3f1e7667b45733" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.544774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxmn4" event={"ID":"d16ca88c-55fc-497e-818c-ed358e0c4bfb","Type":"ContainerStarted","Data":"5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3"} Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.575963 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acd3867f-b2c0-422f-872e-91b01c3e1eed" (UID: "acd3867f-b2c0-422f-872e-91b01c3e1eed"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.626820 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qxzl\" (UniqueName: \"kubernetes.io/projected/acd3867f-b2c0-422f-872e-91b01c3e1eed-kube-api-access-8qxzl\") on node \"crc\" DevicePath \"\"" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.626868 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.626878 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acd3867f-b2c0-422f-872e-91b01c3e1eed-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.867047 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bnw5f"] Oct 01 14:59:52 crc kubenswrapper[4771]: I1001 14:59:52.872476 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bnw5f"] Oct 01 14:59:53 crc kubenswrapper[4771]: I1001 14:59:53.119523 4771 scope.go:117] "RemoveContainer" containerID="e2308d0346fcdb39c4e1d751bdf41c0e01f0d41cc1fc7ab4b9980322fdf932db" Oct 01 14:59:53 crc kubenswrapper[4771]: I1001 14:59:53.574304 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bxmn4" podStartSLOduration=4.640018645 podStartE2EDuration="1m15.574280784s" podCreationTimestamp="2025-10-01 14:58:38 +0000 UTC" firstStartedPulling="2025-10-01 14:58:39.97912193 +0000 UTC m=+164.598297101" lastFinishedPulling="2025-10-01 14:59:50.913384069 +0000 UTC m=+235.532559240" observedRunningTime="2025-10-01 14:59:53.570854766 +0000 UTC m=+238.190029937" watchObservedRunningTime="2025-10-01 14:59:53.574280784 +0000 
UTC m=+238.193455955" Oct 01 14:59:53 crc kubenswrapper[4771]: I1001 14:59:53.936706 4771 scope.go:117] "RemoveContainer" containerID="d393237977ea8842c37edbd3e615d26b802f7117a912ffed28efc1f84e27280e" Oct 01 14:59:53 crc kubenswrapper[4771]: I1001 14:59:53.992491 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" path="/var/lib/kubelet/pods/acd3867f-b2c0-422f-872e-91b01c3e1eed/volumes" Oct 01 14:59:54 crc kubenswrapper[4771]: I1001 14:59:54.561381 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8g2c" event={"ID":"dc5bfdae-9f89-4ef2-b197-69937670c341","Type":"ContainerStarted","Data":"89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2"} Oct 01 14:59:55 crc kubenswrapper[4771]: I1001 14:59:55.600518 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-h8g2c" podStartSLOduration=3.619531894 podStartE2EDuration="1m18.600495578s" podCreationTimestamp="2025-10-01 14:58:37 +0000 UTC" firstStartedPulling="2025-10-01 14:58:38.954865471 +0000 UTC m=+163.574040642" lastFinishedPulling="2025-10-01 14:59:53.935829155 +0000 UTC m=+238.555004326" observedRunningTime="2025-10-01 14:59:55.596569107 +0000 UTC m=+240.215744348" watchObservedRunningTime="2025-10-01 14:59:55.600495578 +0000 UTC m=+240.219670749" Oct 01 14:59:55 crc kubenswrapper[4771]: I1001 14:59:55.603131 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:59:55 crc kubenswrapper[4771]: I1001 14:59:55.652779 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9dq54" Oct 01 14:59:56 crc kubenswrapper[4771]: I1001 14:59:56.578074 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bf6jc" 
event={"ID":"8d691866-2c89-43bc-9d4b-931c6ac2b4d7","Type":"ContainerStarted","Data":"d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e"} Oct 01 14:59:56 crc kubenswrapper[4771]: I1001 14:59:56.602605 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bf6jc" podStartSLOduration=2.923652037 podStartE2EDuration="1m18.602580412s" podCreationTimestamp="2025-10-01 14:58:38 +0000 UTC" firstStartedPulling="2025-10-01 14:58:39.984795715 +0000 UTC m=+164.603970886" lastFinishedPulling="2025-10-01 14:59:55.66372407 +0000 UTC m=+240.282899261" observedRunningTime="2025-10-01 14:59:56.596469753 +0000 UTC m=+241.215644954" watchObservedRunningTime="2025-10-01 14:59:56.602580412 +0000 UTC m=+241.221755623" Oct 01 14:59:57 crc kubenswrapper[4771]: I1001 14:59:57.786575 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:59:57 crc kubenswrapper[4771]: I1001 14:59:57.787193 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:59:57 crc kubenswrapper[4771]: I1001 14:59:57.842251 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 14:59:58 crc kubenswrapper[4771]: I1001 14:59:58.770022 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:59:58 crc kubenswrapper[4771]: I1001 14:59:58.770097 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:59:58 crc kubenswrapper[4771]: I1001 14:59:58.814359 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 14:59:59 crc kubenswrapper[4771]: I1001 14:59:59.150445 4771 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:59:59 crc kubenswrapper[4771]: I1001 14:59:59.151302 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 14:59:59 crc kubenswrapper[4771]: I1001 14:59:59.629412 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139027 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch"] Oct 01 15:00:00 crc kubenswrapper[4771]: E1001 15:00:00.139597 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerName="extract-utilities" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139614 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerName="extract-utilities" Oct 01 15:00:00 crc kubenswrapper[4771]: E1001 15:00:00.139626 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerName="extract-utilities" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139634 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerName="extract-utilities" Oct 01 15:00:00 crc kubenswrapper[4771]: E1001 15:00:00.139642 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerName="extract-content" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139649 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerName="extract-content" Oct 01 15:00:00 crc kubenswrapper[4771]: E1001 15:00:00.139661 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b0c930b-aa23-40ab-892a-fe0eb123351c" containerName="pruner" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139668 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b0c930b-aa23-40ab-892a-fe0eb123351c" containerName="pruner" Oct 01 15:00:00 crc kubenswrapper[4771]: E1001 15:00:00.139678 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerName="registry-server" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139687 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerName="registry-server" Oct 01 15:00:00 crc kubenswrapper[4771]: E1001 15:00:00.139702 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerName="extract-content" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139710 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerName="extract-content" Oct 01 15:00:00 crc kubenswrapper[4771]: E1001 15:00:00.139725 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerName="registry-server" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139750 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" containerName="registry-server" Oct 01 15:00:00 crc kubenswrapper[4771]: E1001 15:00:00.139761 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0dfffa8-847f-4e13-970c-b7f72bd3d92f" containerName="pruner" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139769 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0dfffa8-847f-4e13-970c-b7f72bd3d92f" containerName="pruner" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139893 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eab21930-5f16-4c0b-a4c9-f309f5cc3049" 
containerName="registry-server" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139908 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b0c930b-aa23-40ab-892a-fe0eb123351c" containerName="pruner" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139921 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd3867f-b2c0-422f-872e-91b01c3e1eed" containerName="registry-server" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.139930 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0dfffa8-847f-4e13-970c-b7f72bd3d92f" containerName="pruner" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.140378 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.145850 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.146006 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.153982 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch"] Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.194656 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bf6jc" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="registry-server" probeResult="failure" output=< Oct 01 15:00:00 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Oct 01 15:00:00 crc kubenswrapper[4771]: > Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.226116 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-z45x5\" (UniqueName: \"kubernetes.io/projected/f335a8c9-a8a0-4060-a1e0-690673e260de-kube-api-access-z45x5\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.226202 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f335a8c9-a8a0-4060-a1e0-690673e260de-config-volume\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.226266 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f335a8c9-a8a0-4060-a1e0-690673e260de-secret-volume\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.327626 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z45x5\" (UniqueName: \"kubernetes.io/projected/f335a8c9-a8a0-4060-a1e0-690673e260de-kube-api-access-z45x5\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.327704 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f335a8c9-a8a0-4060-a1e0-690673e260de-config-volume\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.327762 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f335a8c9-a8a0-4060-a1e0-690673e260de-secret-volume\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.329538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f335a8c9-a8a0-4060-a1e0-690673e260de-config-volume\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.334035 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f335a8c9-a8a0-4060-a1e0-690673e260de-secret-volume\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.358850 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z45x5\" (UniqueName: \"kubernetes.io/projected/f335a8c9-a8a0-4060-a1e0-690673e260de-kube-api-access-z45x5\") pod \"collect-profiles-29322180-7bqch\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.458667 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:00 crc kubenswrapper[4771]: I1001 15:00:00.887566 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch"] Oct 01 15:00:00 crc kubenswrapper[4771]: W1001 15:00:00.895516 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf335a8c9_a8a0_4060_a1e0_690673e260de.slice/crio-304c77dfed31d2bbe6c6f5de23ec64879301c10e6b6e7da47dbc87926279cec6 WatchSource:0}: Error finding container 304c77dfed31d2bbe6c6f5de23ec64879301c10e6b6e7da47dbc87926279cec6: Status 404 returned error can't find the container with id 304c77dfed31d2bbe6c6f5de23ec64879301c10e6b6e7da47dbc87926279cec6 Oct 01 15:00:01 crc kubenswrapper[4771]: I1001 15:00:01.604933 4771 generic.go:334] "Generic (PLEG): container finished" podID="f335a8c9-a8a0-4060-a1e0-690673e260de" containerID="896c92d7cf3f1058b27170ad2e06ed3c8a3e8d2e4a1893fd5d60760e7d6128c6" exitCode=0 Oct 01 15:00:01 crc kubenswrapper[4771]: I1001 15:00:01.605469 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" event={"ID":"f335a8c9-a8a0-4060-a1e0-690673e260de","Type":"ContainerDied","Data":"896c92d7cf3f1058b27170ad2e06ed3c8a3e8d2e4a1893fd5d60760e7d6128c6"} Oct 01 15:00:01 crc kubenswrapper[4771]: I1001 15:00:01.605517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" event={"ID":"f335a8c9-a8a0-4060-a1e0-690673e260de","Type":"ContainerStarted","Data":"304c77dfed31d2bbe6c6f5de23ec64879301c10e6b6e7da47dbc87926279cec6"} Oct 01 15:00:02 crc kubenswrapper[4771]: I1001 15:00:02.913420 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.068547 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f335a8c9-a8a0-4060-a1e0-690673e260de-secret-volume\") pod \"f335a8c9-a8a0-4060-a1e0-690673e260de\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.068695 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f335a8c9-a8a0-4060-a1e0-690673e260de-config-volume\") pod \"f335a8c9-a8a0-4060-a1e0-690673e260de\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.068770 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z45x5\" (UniqueName: \"kubernetes.io/projected/f335a8c9-a8a0-4060-a1e0-690673e260de-kube-api-access-z45x5\") pod \"f335a8c9-a8a0-4060-a1e0-690673e260de\" (UID: \"f335a8c9-a8a0-4060-a1e0-690673e260de\") " Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.069522 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f335a8c9-a8a0-4060-a1e0-690673e260de-config-volume" (OuterVolumeSpecName: "config-volume") pod "f335a8c9-a8a0-4060-a1e0-690673e260de" (UID: "f335a8c9-a8a0-4060-a1e0-690673e260de"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.073614 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f335a8c9-a8a0-4060-a1e0-690673e260de-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f335a8c9-a8a0-4060-a1e0-690673e260de" (UID: "f335a8c9-a8a0-4060-a1e0-690673e260de"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.077917 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f335a8c9-a8a0-4060-a1e0-690673e260de-kube-api-access-z45x5" (OuterVolumeSpecName: "kube-api-access-z45x5") pod "f335a8c9-a8a0-4060-a1e0-690673e260de" (UID: "f335a8c9-a8a0-4060-a1e0-690673e260de"). InnerVolumeSpecName "kube-api-access-z45x5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.170111 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f335a8c9-a8a0-4060-a1e0-690673e260de-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.170173 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z45x5\" (UniqueName: \"kubernetes.io/projected/f335a8c9-a8a0-4060-a1e0-690673e260de-kube-api-access-z45x5\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.170195 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f335a8c9-a8a0-4060-a1e0-690673e260de-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.621648 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" event={"ID":"f335a8c9-a8a0-4060-a1e0-690673e260de","Type":"ContainerDied","Data":"304c77dfed31d2bbe6c6f5de23ec64879301c10e6b6e7da47dbc87926279cec6"} Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.621958 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="304c77dfed31d2bbe6c6f5de23ec64879301c10e6b6e7da47dbc87926279cec6" Oct 01 15:00:03 crc kubenswrapper[4771]: I1001 15:00:03.621883 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch" Oct 01 15:00:07 crc kubenswrapper[4771]: I1001 15:00:07.840599 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 15:00:07 crc kubenswrapper[4771]: I1001 15:00:07.885440 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8g2c"] Oct 01 15:00:08 crc kubenswrapper[4771]: I1001 15:00:08.139250 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pr972"] Oct 01 15:00:08 crc kubenswrapper[4771]: I1001 15:00:08.649079 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-h8g2c" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerName="registry-server" containerID="cri-o://89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2" gracePeriod=2 Oct 01 15:00:08 crc kubenswrapper[4771]: I1001 15:00:08.993594 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.141795 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-utilities\") pod \"dc5bfdae-9f89-4ef2-b197-69937670c341\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.141893 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tcdb\" (UniqueName: \"kubernetes.io/projected/dc5bfdae-9f89-4ef2-b197-69937670c341-kube-api-access-7tcdb\") pod \"dc5bfdae-9f89-4ef2-b197-69937670c341\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.142764 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-utilities" (OuterVolumeSpecName: "utilities") pod "dc5bfdae-9f89-4ef2-b197-69937670c341" (UID: "dc5bfdae-9f89-4ef2-b197-69937670c341"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.142855 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-catalog-content\") pod \"dc5bfdae-9f89-4ef2-b197-69937670c341\" (UID: \"dc5bfdae-9f89-4ef2-b197-69937670c341\") " Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.143090 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.147019 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc5bfdae-9f89-4ef2-b197-69937670c341-kube-api-access-7tcdb" (OuterVolumeSpecName: "kube-api-access-7tcdb") pod "dc5bfdae-9f89-4ef2-b197-69937670c341" (UID: "dc5bfdae-9f89-4ef2-b197-69937670c341"). InnerVolumeSpecName "kube-api-access-7tcdb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.155109 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc5bfdae-9f89-4ef2-b197-69937670c341" (UID: "dc5bfdae-9f89-4ef2-b197-69937670c341"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.189414 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.227645 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.243866 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tcdb\" (UniqueName: \"kubernetes.io/projected/dc5bfdae-9f89-4ef2-b197-69937670c341-kube-api-access-7tcdb\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.243899 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc5bfdae-9f89-4ef2-b197-69937670c341-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.670711 4771 generic.go:334] "Generic (PLEG): container finished" podID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerID="89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2" exitCode=0 Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.670797 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8g2c" event={"ID":"dc5bfdae-9f89-4ef2-b197-69937670c341","Type":"ContainerDied","Data":"89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2"} Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.671200 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-h8g2c" event={"ID":"dc5bfdae-9f89-4ef2-b197-69937670c341","Type":"ContainerDied","Data":"0857d1856fa16309ceec618084603c51f1d80810a52385b0360eeca3092771cc"} Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.671226 4771 scope.go:117] "RemoveContainer" 
containerID="89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.670918 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-h8g2c" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.690200 4771 scope.go:117] "RemoveContainer" containerID="01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.700004 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8g2c"] Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.702737 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-h8g2c"] Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.722950 4771 scope.go:117] "RemoveContainer" containerID="6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.735029 4771 scope.go:117] "RemoveContainer" containerID="89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2" Oct 01 15:00:09 crc kubenswrapper[4771]: E1001 15:00:09.735487 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2\": container with ID starting with 89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2 not found: ID does not exist" containerID="89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.735520 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2"} err="failed to get container status \"89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2\": rpc error: code = NotFound desc = could 
not find container \"89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2\": container with ID starting with 89de138919bf3a0780ffa9654d7c283d25e54415077fae3cb50d5f22dd1b1ff2 not found: ID does not exist" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.735543 4771 scope.go:117] "RemoveContainer" containerID="01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168" Oct 01 15:00:09 crc kubenswrapper[4771]: E1001 15:00:09.735927 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168\": container with ID starting with 01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168 not found: ID does not exist" containerID="01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.735971 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168"} err="failed to get container status \"01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168\": rpc error: code = NotFound desc = could not find container \"01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168\": container with ID starting with 01ddf22875220071c811425d0661d2044e75278ffd4809e5f39c0109e80f4168 not found: ID does not exist" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.736003 4771 scope.go:117] "RemoveContainer" containerID="6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7" Oct 01 15:00:09 crc kubenswrapper[4771]: E1001 15:00:09.736294 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7\": container with ID starting with 6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7 not found: 
ID does not exist" containerID="6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.736316 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7"} err="failed to get container status \"6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7\": rpc error: code = NotFound desc = could not find container \"6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7\": container with ID starting with 6c5f00b7d5ea7b3521c534813eb31d382d7d633f273d2208799751249d9c49c7 not found: ID does not exist" Oct 01 15:00:09 crc kubenswrapper[4771]: I1001 15:00:09.990615 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" path="/var/lib/kubelet/pods/dc5bfdae-9f89-4ef2-b197-69937670c341/volumes" Oct 01 15:00:11 crc kubenswrapper[4771]: I1001 15:00:11.616075 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bf6jc"] Oct 01 15:00:11 crc kubenswrapper[4771]: I1001 15:00:11.616280 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bf6jc" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="registry-server" containerID="cri-o://d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e" gracePeriod=2 Oct 01 15:00:11 crc kubenswrapper[4771]: I1001 15:00:11.992279 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.178959 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd69v\" (UniqueName: \"kubernetes.io/projected/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-kube-api-access-hd69v\") pod \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.179041 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-catalog-content\") pod \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.179140 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-utilities\") pod \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\" (UID: \"8d691866-2c89-43bc-9d4b-931c6ac2b4d7\") " Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.179806 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-utilities" (OuterVolumeSpecName: "utilities") pod "8d691866-2c89-43bc-9d4b-931c6ac2b4d7" (UID: "8d691866-2c89-43bc-9d4b-931c6ac2b4d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.187844 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-kube-api-access-hd69v" (OuterVolumeSpecName: "kube-api-access-hd69v") pod "8d691866-2c89-43bc-9d4b-931c6ac2b4d7" (UID: "8d691866-2c89-43bc-9d4b-931c6ac2b4d7"). InnerVolumeSpecName "kube-api-access-hd69v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.258613 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d691866-2c89-43bc-9d4b-931c6ac2b4d7" (UID: "8d691866-2c89-43bc-9d4b-931c6ac2b4d7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.280999 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.281043 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd69v\" (UniqueName: \"kubernetes.io/projected/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-kube-api-access-hd69v\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.281064 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d691866-2c89-43bc-9d4b-931c6ac2b4d7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.689115 4771 generic.go:334] "Generic (PLEG): container finished" podID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerID="d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e" exitCode=0 Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.689176 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bf6jc" event={"ID":"8d691866-2c89-43bc-9d4b-931c6ac2b4d7","Type":"ContainerDied","Data":"d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e"} Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.689825 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-bf6jc" event={"ID":"8d691866-2c89-43bc-9d4b-931c6ac2b4d7","Type":"ContainerDied","Data":"7f900588b1defff85cecd6a2b750afca4dfab02df369e6cd97483b77ea92aabd"} Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.689872 4771 scope.go:117] "RemoveContainer" containerID="d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.689933 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bf6jc" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.720227 4771 scope.go:117] "RemoveContainer" containerID="6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.720616 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bf6jc"] Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.722795 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bf6jc"] Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.750945 4771 scope.go:117] "RemoveContainer" containerID="f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.764462 4771 scope.go:117] "RemoveContainer" containerID="d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e" Oct 01 15:00:12 crc kubenswrapper[4771]: E1001 15:00:12.765020 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e\": container with ID starting with d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e not found: ID does not exist" containerID="d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.765069 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e"} err="failed to get container status \"d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e\": rpc error: code = NotFound desc = could not find container \"d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e\": container with ID starting with d55ceb8b773a9a98aa3d3bf1d495c78b234b5695d556cd562a218e72d5f6f99e not found: ID does not exist" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.765094 4771 scope.go:117] "RemoveContainer" containerID="6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2" Oct 01 15:00:12 crc kubenswrapper[4771]: E1001 15:00:12.765544 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2\": container with ID starting with 6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2 not found: ID does not exist" containerID="6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.765803 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2"} err="failed to get container status \"6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2\": rpc error: code = NotFound desc = could not find container \"6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2\": container with ID starting with 6cf793c69ac00d1079ebd0af9d6e4aa3d8936c757f9df8e3525112e2d025a1f2 not found: ID does not exist" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.766034 4771 scope.go:117] "RemoveContainer" containerID="f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0" Oct 01 15:00:12 crc kubenswrapper[4771]: E1001 
15:00:12.766481 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0\": container with ID starting with f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0 not found: ID does not exist" containerID="f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0" Oct 01 15:00:12 crc kubenswrapper[4771]: I1001 15:00:12.766506 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0"} err="failed to get container status \"f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0\": rpc error: code = NotFound desc = could not find container \"f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0\": container with ID starting with f6f929d21bd6fc0cf757d137678b06043a82d39555d25c2586279292bb2e96d0 not found: ID does not exist" Oct 01 15:00:14 crc kubenswrapper[4771]: I1001 15:00:14.006245 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" path="/var/lib/kubelet/pods/8d691866-2c89-43bc-9d4b-931c6ac2b4d7/volumes" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.181791 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" podUID="0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" containerName="oauth-openshift" containerID="cri-o://cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207" gracePeriod=15 Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.661785 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.695684 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5686c9c7dd-2q89s"] Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.696025 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="extract-content" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696052 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="extract-content" Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.696069 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="registry-server" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696080 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="registry-server" Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.696101 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerName="extract-content" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696112 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerName="extract-content" Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.696134 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerName="registry-server" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696145 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerName="registry-server" Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.696161 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f335a8c9-a8a0-4060-a1e0-690673e260de" containerName="collect-profiles" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696171 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f335a8c9-a8a0-4060-a1e0-690673e260de" containerName="collect-profiles" Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.696183 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerName="extract-utilities" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696194 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerName="extract-utilities" Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.696213 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" containerName="oauth-openshift" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696223 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" containerName="oauth-openshift" Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.696236 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="extract-utilities" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696246 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="extract-utilities" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696401 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f335a8c9-a8a0-4060-a1e0-690673e260de" containerName="collect-profiles" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696418 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc5bfdae-9f89-4ef2-b197-69937670c341" containerName="registry-server" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696429 4771 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="8d691866-2c89-43bc-9d4b-931c6ac2b4d7" containerName="registry-server" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.696448 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" containerName="oauth-openshift" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.697070 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.712121 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5686c9c7dd-2q89s"] Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808165 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-session\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808295 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-idp-0-file-data\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808333 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-cliconfig\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808394 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" 
(UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-router-certs\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808432 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-login\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808465 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-dir\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808519 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-provider-selection\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808563 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-service-ca\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808594 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-policies\") pod 
\"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808646 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hp8lb\" (UniqueName: \"kubernetes.io/projected/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-kube-api-access-hp8lb\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808682 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-error\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808767 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-trusted-ca-bundle\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808800 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-serving-cert\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.808839 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-ocp-branding-template\") pod \"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\" (UID: 
\"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d\") " Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809024 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-error\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809082 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809156 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-login\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809190 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809262 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809303 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md6tl\" (UniqueName: \"kubernetes.io/projected/801f1e63-b4dc-4975-82e9-15ebc2502aba-kube-api-access-md6tl\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809322 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809342 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/801f1e63-b4dc-4975-82e9-15ebc2502aba-audit-dir\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809586 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-session\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809600 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809622 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-service-ca\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809663 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-audit-policies\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809696 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-router-certs\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809707 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809721 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809766 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809794 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809971 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.809990 4771 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.810006 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.810018 4771 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.810031 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.814213 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.814407 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-kube-api-access-hp8lb" (OuterVolumeSpecName: "kube-api-access-hp8lb") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "kube-api-access-hp8lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.814871 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.815176 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.817071 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.817415 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.820187 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.820386 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.821407 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" (UID: "0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.835518 4771 generic.go:334] "Generic (PLEG): container finished" podID="0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" containerID="cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207" exitCode=0 Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.835569 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.835565 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" event={"ID":"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d","Type":"ContainerDied","Data":"cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207"} Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.836435 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-pr972" event={"ID":"0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d","Type":"ContainerDied","Data":"34b765a834fe7b4fac8af6c5f00d2a569085e864c140edf538a322ef3ed46827"} Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.836462 4771 scope.go:117] "RemoveContainer" containerID="cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.873823 4771 scope.go:117] "RemoveContainer" containerID="cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207" Oct 01 15:00:33 crc kubenswrapper[4771]: E1001 15:00:33.874286 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207\": container with ID starting with cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207 not found: ID does not exist" containerID="cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207" 
Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.874320 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207"} err="failed to get container status \"cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207\": rpc error: code = NotFound desc = could not find container \"cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207\": container with ID starting with cc913c282407242dce14acab5ada402e8f333fc188d851c277881a722300e207 not found: ID does not exist" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.883747 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pr972"] Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.888160 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-pr972"] Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.911169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.911354 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md6tl\" (UniqueName: \"kubernetes.io/projected/801f1e63-b4dc-4975-82e9-15ebc2502aba-kube-api-access-md6tl\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.911472 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/801f1e63-b4dc-4975-82e9-15ebc2502aba-audit-dir\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.911572 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.911680 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-session\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.911789 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-service-ca\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.911895 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-audit-policies\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 
15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.911997 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-router-certs\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912101 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912202 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-error\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912424 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-login\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912498 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912599 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912662 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912748 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912819 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912886 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.912954 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.913013 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.913076 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hp8lb\" (UniqueName: \"kubernetes.io/projected/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-kube-api-access-hp8lb\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.913141 4771 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.914027 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.914347 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/801f1e63-b4dc-4975-82e9-15ebc2502aba-audit-dir\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.914427 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-audit-policies\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.915355 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.916221 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-service-ca\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " 
pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.919520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-router-certs\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.919607 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.919810 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-login\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.919890 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-template-error\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.920388 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.922036 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-session\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.922321 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.922475 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/801f1e63-b4dc-4975-82e9-15ebc2502aba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.940048 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md6tl\" (UniqueName: \"kubernetes.io/projected/801f1e63-b4dc-4975-82e9-15ebc2502aba-kube-api-access-md6tl\") pod \"oauth-openshift-5686c9c7dd-2q89s\" (UID: \"801f1e63-b4dc-4975-82e9-15ebc2502aba\") " 
pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:33 crc kubenswrapper[4771]: I1001 15:00:33.994958 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d" path="/var/lib/kubelet/pods/0ec63d2f-8b57-41c2-9cda-b65bbb63fd9d/volumes" Oct 01 15:00:34 crc kubenswrapper[4771]: I1001 15:00:34.018204 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:34 crc kubenswrapper[4771]: I1001 15:00:34.476315 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5686c9c7dd-2q89s"] Oct 01 15:00:34 crc kubenswrapper[4771]: I1001 15:00:34.853828 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" event={"ID":"801f1e63-b4dc-4975-82e9-15ebc2502aba","Type":"ContainerStarted","Data":"a62519deb999115aff2295535b7935ee8a99d87e4d7ef0ad4f1e4b52edf2dd7e"} Oct 01 15:00:34 crc kubenswrapper[4771]: I1001 15:00:34.855161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" event={"ID":"801f1e63-b4dc-4975-82e9-15ebc2502aba","Type":"ContainerStarted","Data":"e3d1f5db114c98fa9325191f2c70a741e313fc0f09a950171a78cb4f37ebb0fe"} Oct 01 15:00:34 crc kubenswrapper[4771]: I1001 15:00:34.855179 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:34 crc kubenswrapper[4771]: I1001 15:00:34.859059 4771 patch_prober.go:28] interesting pod/oauth-openshift-5686c9c7dd-2q89s container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.55:6443/healthz\": dial tcp 10.217.0.55:6443: connect: connection refused" start-of-body= Oct 01 15:00:34 crc kubenswrapper[4771]: I1001 15:00:34.859117 4771 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" podUID="801f1e63-b4dc-4975-82e9-15ebc2502aba" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.55:6443/healthz\": dial tcp 10.217.0.55:6443: connect: connection refused" Oct 01 15:00:34 crc kubenswrapper[4771]: I1001 15:00:34.876368 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" podStartSLOduration=26.876349978 podStartE2EDuration="26.876349978s" podCreationTimestamp="2025-10-01 15:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:00:34.874162191 +0000 UTC m=+279.493337372" watchObservedRunningTime="2025-10-01 15:00:34.876349978 +0000 UTC m=+279.495525159" Oct 01 15:00:35 crc kubenswrapper[4771]: I1001 15:00:35.874675 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5686c9c7dd-2q89s" Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.902585 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m5m6x"] Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.903305 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m5m6x" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerName="registry-server" containerID="cri-o://79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3" gracePeriod=30 Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.907777 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dq54"] Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.908100 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/community-operators-9dq54" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="registry-server" containerID="cri-o://afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca" gracePeriod=30 Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.912568 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rsd2x"] Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.912790 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" podUID="b248e756-e3b8-4fd9-b6fb-99ee87df696d" containerName="marketplace-operator" containerID="cri-o://f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899" gracePeriod=30 Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.918010 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6z4mz"] Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.918264 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6z4mz" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerName="registry-server" containerID="cri-o://0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237" gracePeriod=30 Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.919928 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bxmn4"] Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.920174 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bxmn4" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerName="registry-server" containerID="cri-o://5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3" gracePeriod=30 Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.929837 4771 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bz2c"] Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.931103 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:53 crc kubenswrapper[4771]: I1001 15:00:53.951825 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bz2c"] Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.077654 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf346cda-7a16-42a2-b731-d8834b7a1380-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.078244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf346cda-7a16-42a2-b731-d8834b7a1380-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.078278 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9m7x\" (UniqueName: \"kubernetes.io/projected/bf346cda-7a16-42a2-b731-d8834b7a1380-kube-api-access-s9m7x\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.178950 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf346cda-7a16-42a2-b731-d8834b7a1380-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.179009 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9m7x\" (UniqueName: \"kubernetes.io/projected/bf346cda-7a16-42a2-b731-d8834b7a1380-kube-api-access-s9m7x\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.179067 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf346cda-7a16-42a2-b731-d8834b7a1380-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.181140 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf346cda-7a16-42a2-b731-d8834b7a1380-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.191078 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/bf346cda-7a16-42a2-b731-d8834b7a1380-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.196143 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9m7x\" (UniqueName: \"kubernetes.io/projected/bf346cda-7a16-42a2-b731-d8834b7a1380-kube-api-access-s9m7x\") pod \"marketplace-operator-79b997595-4bz2c\" (UID: \"bf346cda-7a16-42a2-b731-d8834b7a1380\") " pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.251209 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.396710 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.408181 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.431604 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9dq54" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.435547 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.455577 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.502744 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4bz2c"] Oct 01 15:00:54 crc kubenswrapper[4771]: W1001 15:00:54.513219 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf346cda_7a16_42a2_b731_d8834b7a1380.slice/crio-50b8feb2cb67fdce3bad727049cfc55c0482ff6f6ee5f2ef8516a942edbf9d00 WatchSource:0}: Error finding container 50b8feb2cb67fdce3bad727049cfc55c0482ff6f6ee5f2ef8516a942edbf9d00: Status 404 returned error can't find the container with id 50b8feb2cb67fdce3bad727049cfc55c0482ff6f6ee5f2ef8516a942edbf9d00 Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.583068 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-utilities\") pod \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.583124 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-catalog-content\") pod \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.583143 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9649d\" (UniqueName: \"kubernetes.io/projected/d16ca88c-55fc-497e-818c-ed358e0c4bfb-kube-api-access-9649d\") pod \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.583166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-g4htw\" (UniqueName: \"kubernetes.io/projected/e62d1022-e44e-4353-a6da-b846b2cb2858-kube-api-access-g4htw\") pod \"e62d1022-e44e-4353-a6da-b846b2cb2858\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.583204 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-utilities\") pod \"e62d1022-e44e-4353-a6da-b846b2cb2858\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.583259 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-catalog-content\") pod \"e62d1022-e44e-4353-a6da-b846b2cb2858\" (UID: \"e62d1022-e44e-4353-a6da-b846b2cb2858\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.583281 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-utilities\") pod \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.584495 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-utilities" (OuterVolumeSpecName: "utilities") pod "ee2a2c00-b21b-45af-9de5-0cc26da899b3" (UID: "ee2a2c00-b21b-45af-9de5-0cc26da899b3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.584561 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-utilities" (OuterVolumeSpecName: "utilities") pod "e62d1022-e44e-4353-a6da-b846b2cb2858" (UID: "e62d1022-e44e-4353-a6da-b846b2cb2858"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.585200 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-utilities" (OuterVolumeSpecName: "utilities") pod "05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" (UID: "05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.585263 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-operator-metrics\") pod \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.585317 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-trusted-ca\") pod \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.586270 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b248e756-e3b8-4fd9-b6fb-99ee87df696d" (UID: 
"b248e756-e3b8-4fd9-b6fb-99ee87df696d"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.586342 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q9q6\" (UniqueName: \"kubernetes.io/projected/b248e756-e3b8-4fd9-b6fb-99ee87df696d-kube-api-access-9q9q6\") pod \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\" (UID: \"b248e756-e3b8-4fd9-b6fb-99ee87df696d\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.586379 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-catalog-content\") pod \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.586412 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-catalog-content\") pod \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.586441 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-utilities\") pod \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\" (UID: \"d16ca88c-55fc-497e-818c-ed358e0c4bfb\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.586474 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvtlq\" (UniqueName: \"kubernetes.io/projected/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-kube-api-access-gvtlq\") pod \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\" (UID: \"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.586505 
4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgz52\" (UniqueName: \"kubernetes.io/projected/ee2a2c00-b21b-45af-9de5-0cc26da899b3-kube-api-access-bgz52\") pod \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\" (UID: \"ee2a2c00-b21b-45af-9de5-0cc26da899b3\") " Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.587445 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-utilities" (OuterVolumeSpecName: "utilities") pod "d16ca88c-55fc-497e-818c-ed358e0c4bfb" (UID: "d16ca88c-55fc-497e-818c-ed358e0c4bfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.588187 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e62d1022-e44e-4353-a6da-b846b2cb2858-kube-api-access-g4htw" (OuterVolumeSpecName: "kube-api-access-g4htw") pod "e62d1022-e44e-4353-a6da-b846b2cb2858" (UID: "e62d1022-e44e-4353-a6da-b846b2cb2858"). InnerVolumeSpecName "kube-api-access-g4htw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.588654 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b248e756-e3b8-4fd9-b6fb-99ee87df696d-kube-api-access-9q9q6" (OuterVolumeSpecName: "kube-api-access-9q9q6") pod "b248e756-e3b8-4fd9-b6fb-99ee87df696d" (UID: "b248e756-e3b8-4fd9-b6fb-99ee87df696d"). InnerVolumeSpecName "kube-api-access-9q9q6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.589809 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-kube-api-access-gvtlq" (OuterVolumeSpecName: "kube-api-access-gvtlq") pod "05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" (UID: "05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5"). InnerVolumeSpecName "kube-api-access-gvtlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.590068 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d16ca88c-55fc-497e-818c-ed358e0c4bfb-kube-api-access-9649d" (OuterVolumeSpecName: "kube-api-access-9649d") pod "d16ca88c-55fc-497e-818c-ed358e0c4bfb" (UID: "d16ca88c-55fc-497e-818c-ed358e0c4bfb"). InnerVolumeSpecName "kube-api-access-9649d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.590287 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee2a2c00-b21b-45af-9de5-0cc26da899b3-kube-api-access-bgz52" (OuterVolumeSpecName: "kube-api-access-bgz52") pod "ee2a2c00-b21b-45af-9de5-0cc26da899b3" (UID: "ee2a2c00-b21b-45af-9de5-0cc26da899b3"). InnerVolumeSpecName "kube-api-access-bgz52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591179 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591196 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q9q6\" (UniqueName: \"kubernetes.io/projected/b248e756-e3b8-4fd9-b6fb-99ee87df696d-kube-api-access-9q9q6\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591207 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591270 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvtlq\" (UniqueName: \"kubernetes.io/projected/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-kube-api-access-gvtlq\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591337 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgz52\" (UniqueName: \"kubernetes.io/projected/ee2a2c00-b21b-45af-9de5-0cc26da899b3-kube-api-access-bgz52\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591349 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591358 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9649d\" (UniqueName: \"kubernetes.io/projected/d16ca88c-55fc-497e-818c-ed358e0c4bfb-kube-api-access-9649d\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 
crc kubenswrapper[4771]: I1001 15:00:54.591918 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4htw\" (UniqueName: \"kubernetes.io/projected/e62d1022-e44e-4353-a6da-b846b2cb2858-kube-api-access-g4htw\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591933 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.591956 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.592075 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b248e756-e3b8-4fd9-b6fb-99ee87df696d" (UID: "b248e756-e3b8-4fd9-b6fb-99ee87df696d"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.601754 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee2a2c00-b21b-45af-9de5-0cc26da899b3" (UID: "ee2a2c00-b21b-45af-9de5-0cc26da899b3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.648308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e62d1022-e44e-4353-a6da-b846b2cb2858" (UID: "e62d1022-e44e-4353-a6da-b846b2cb2858"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.654632 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" (UID: "05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.665486 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d16ca88c-55fc-497e-818c-ed358e0c4bfb" (UID: "d16ca88c-55fc-497e-818c-ed358e0c4bfb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.693494 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d16ca88c-55fc-497e-818c-ed358e0c4bfb-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.693532 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e62d1022-e44e-4353-a6da-b846b2cb2858-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.693546 4771 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b248e756-e3b8-4fd9-b6fb-99ee87df696d-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.693561 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee2a2c00-b21b-45af-9de5-0cc26da899b3-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.693575 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.973672 4771 generic.go:334] "Generic (PLEG): container finished" podID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerID="0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237" exitCode=0 Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.973753 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6z4mz" 
event={"ID":"ee2a2c00-b21b-45af-9de5-0cc26da899b3","Type":"ContainerDied","Data":"0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237"} Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.973772 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6z4mz" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.973850 4771 scope.go:117] "RemoveContainer" containerID="0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.973781 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6z4mz" event={"ID":"ee2a2c00-b21b-45af-9de5-0cc26da899b3","Type":"ContainerDied","Data":"645836519f313cb989cf46703d1303657e531546ef20dc8b250f740dc1940d18"} Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.975965 4771 generic.go:334] "Generic (PLEG): container finished" podID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerID="5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3" exitCode=0 Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.976027 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bxmn4" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.976045 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxmn4" event={"ID":"d16ca88c-55fc-497e-818c-ed358e0c4bfb","Type":"ContainerDied","Data":"5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3"} Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.976465 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxmn4" event={"ID":"d16ca88c-55fc-497e-818c-ed358e0c4bfb","Type":"ContainerDied","Data":"23d5dd4409626d4894a20c327d4474765c0058fc558a03af92887a780fc4f6d2"} Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.987411 4771 scope.go:117] "RemoveContainer" containerID="1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.988181 4771 generic.go:334] "Generic (PLEG): container finished" podID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerID="afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca" exitCode=0 Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.988221 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dq54" event={"ID":"e62d1022-e44e-4353-a6da-b846b2cb2858","Type":"ContainerDied","Data":"afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca"} Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.988237 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9dq54" event={"ID":"e62d1022-e44e-4353-a6da-b846b2cb2858","Type":"ContainerDied","Data":"da254b2561fe1afd7a542115905138debae75bd6cab5ad95ea1d1bfa3520870e"} Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.988375 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9dq54" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.994273 4771 generic.go:334] "Generic (PLEG): container finished" podID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerID="79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3" exitCode=0 Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.994331 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m5m6x" Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.994361 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5m6x" event={"ID":"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5","Type":"ContainerDied","Data":"79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3"} Oct 01 15:00:54 crc kubenswrapper[4771]: I1001 15:00:54.994399 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m5m6x" event={"ID":"05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5","Type":"ContainerDied","Data":"9274edeb931beec0afd0bc06bcfb580d57a7ecd37484f96913597cc81ac17588"} Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.001455 4771 generic.go:334] "Generic (PLEG): container finished" podID="b248e756-e3b8-4fd9-b6fb-99ee87df696d" containerID="f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899" exitCode=0 Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.001534 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.001597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" event={"ID":"b248e756-e3b8-4fd9-b6fb-99ee87df696d","Type":"ContainerDied","Data":"f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899"} Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.001641 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rsd2x" event={"ID":"b248e756-e3b8-4fd9-b6fb-99ee87df696d","Type":"ContainerDied","Data":"7dec245134ec70155e79f5150e26a7d3b78f05c6ff1948a3ecd14210dc66e1f9"} Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.003797 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6z4mz"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.006268 4771 scope.go:117] "RemoveContainer" containerID="e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.008919 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6z4mz"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.020705 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bxmn4"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.021797 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" event={"ID":"bf346cda-7a16-42a2-b731-d8834b7a1380","Type":"ContainerStarted","Data":"44cdaaacac6ffb4ba1408f8f22f08a0d8d1f031c53bd491d3dfb108bb17d189d"} Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.021826 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" 
event={"ID":"bf346cda-7a16-42a2-b731-d8834b7a1380","Type":"ContainerStarted","Data":"50b8feb2cb67fdce3bad727049cfc55c0482ff6f6ee5f2ef8516a942edbf9d00"} Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.024591 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bxmn4"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.024860 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.032814 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.039065 4771 scope.go:117] "RemoveContainer" containerID="0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.039441 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237\": container with ID starting with 0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237 not found: ID does not exist" containerID="0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.039465 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237"} err="failed to get container status \"0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237\": rpc error: code = NotFound desc = could not find container \"0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237\": container with ID starting with 0c2fec1f27fde74ffeb49e8778edde824ecdca988b56b5b628fb3c09e8c20237 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: 
I1001 15:00:55.039538 4771 scope.go:117] "RemoveContainer" containerID="1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.039926 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c\": container with ID starting with 1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c not found: ID does not exist" containerID="1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.039977 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c"} err="failed to get container status \"1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c\": rpc error: code = NotFound desc = could not find container \"1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c\": container with ID starting with 1751e4363b77cde8e830b1378c693b854095f3915fb563f1e5fcc0c6bdf1273c not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.040005 4771 scope.go:117] "RemoveContainer" containerID="e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.040376 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4\": container with ID starting with e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4 not found: ID does not exist" containerID="e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.040426 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4"} err="failed to get container status \"e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4\": rpc error: code = NotFound desc = could not find container \"e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4\": container with ID starting with e02e3953bdb6ef5e2c5be459755277b258337be69082faf2bef16b8d688ac7d4 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.040451 4771 scope.go:117] "RemoveContainer" containerID="5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.050065 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9dq54"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.055164 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9dq54"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.058656 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4bz2c" podStartSLOduration=2.058642436 podStartE2EDuration="2.058642436s" podCreationTimestamp="2025-10-01 15:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:00:55.054109006 +0000 UTC m=+299.673284167" watchObservedRunningTime="2025-10-01 15:00:55.058642436 +0000 UTC m=+299.677817607" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.069919 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rsd2x"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.073013 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rsd2x"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 
15:00:55.077623 4771 scope.go:117] "RemoveContainer" containerID="8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.083843 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m5m6x"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.085857 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m5m6x"] Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.100199 4771 scope.go:117] "RemoveContainer" containerID="427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.113667 4771 scope.go:117] "RemoveContainer" containerID="5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.114146 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3\": container with ID starting with 5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3 not found: ID does not exist" containerID="5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.114181 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3"} err="failed to get container status \"5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3\": rpc error: code = NotFound desc = could not find container \"5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3\": container with ID starting with 5b20d014da56c26b84dbcc511ce16ef8433463d413fe335a04d462dc6c06f1f3 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.114205 4771 scope.go:117] "RemoveContainer" 
containerID="8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.117157 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef\": container with ID starting with 8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef not found: ID does not exist" containerID="8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.117208 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef"} err="failed to get container status \"8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef\": rpc error: code = NotFound desc = could not find container \"8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef\": container with ID starting with 8126075ed7a32f068b114015dba9d3567bbf1ef28998d757170314db893ac0ef not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.117238 4771 scope.go:117] "RemoveContainer" containerID="427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.117480 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75\": container with ID starting with 427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75 not found: ID does not exist" containerID="427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.117570 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75"} err="failed to get container status \"427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75\": rpc error: code = NotFound desc = could not find container \"427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75\": container with ID starting with 427d4c1624c21c39bda646df812b0a91eb475f6d8c7f3a209a2441ad40d26e75 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.117644 4771 scope.go:117] "RemoveContainer" containerID="afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.131087 4771 scope.go:117] "RemoveContainer" containerID="9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.142692 4771 scope.go:117] "RemoveContainer" containerID="157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.154275 4771 scope.go:117] "RemoveContainer" containerID="afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.155289 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca\": container with ID starting with afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca not found: ID does not exist" containerID="afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.155398 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca"} err="failed to get container status \"afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca\": rpc error: code = 
NotFound desc = could not find container \"afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca\": container with ID starting with afa6b7f8480b6fedd1c1d3ebe5f1f0a1bc9ca14cb73a88a76990da5d02e144ca not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.155483 4771 scope.go:117] "RemoveContainer" containerID="9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.155901 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4\": container with ID starting with 9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4 not found: ID does not exist" containerID="9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.155936 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4"} err="failed to get container status \"9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4\": rpc error: code = NotFound desc = could not find container \"9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4\": container with ID starting with 9aa961c2387154bed99e894981ad8d88c8fce625b2569c0e8d1b5a43d74d30f4 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.155958 4771 scope.go:117] "RemoveContainer" containerID="157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.156236 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660\": container with ID starting with 
157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660 not found: ID does not exist" containerID="157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.156265 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660"} err="failed to get container status \"157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660\": rpc error: code = NotFound desc = could not find container \"157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660\": container with ID starting with 157ab001d44f25990e8b825881903ef859154893d98ff24099d52ca4f86d3660 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.156281 4771 scope.go:117] "RemoveContainer" containerID="79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.165761 4771 scope.go:117] "RemoveContainer" containerID="716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.177164 4771 scope.go:117] "RemoveContainer" containerID="8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.186717 4771 scope.go:117] "RemoveContainer" containerID="79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.187024 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3\": container with ID starting with 79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3 not found: ID does not exist" containerID="79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 
15:00:55.187049 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3"} err="failed to get container status \"79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3\": rpc error: code = NotFound desc = could not find container \"79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3\": container with ID starting with 79105ba6f48d2f25e9cb457db31e839ee8d8159b5130c42c95128bf7a3b6b1c3 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.187069 4771 scope.go:117] "RemoveContainer" containerID="716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.187253 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e\": container with ID starting with 716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e not found: ID does not exist" containerID="716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.187274 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e"} err="failed to get container status \"716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e\": rpc error: code = NotFound desc = could not find container \"716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e\": container with ID starting with 716a0283fb58c69dc04014abe4cac65bdb6162ed0a8463669dd48bab78da4c7e not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.187286 4771 scope.go:117] "RemoveContainer" containerID="8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9" Oct 01 15:00:55 crc 
kubenswrapper[4771]: E1001 15:00:55.187544 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9\": container with ID starting with 8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9 not found: ID does not exist" containerID="8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.187588 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9"} err="failed to get container status \"8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9\": rpc error: code = NotFound desc = could not find container \"8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9\": container with ID starting with 8cc390e041456cc03f1bf05c4e2bb2799f2e2dbe703b3e6bf0967c1998554ca9 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.187617 4771 scope.go:117] "RemoveContainer" containerID="f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.197667 4771 scope.go:117] "RemoveContainer" containerID="f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899" Oct 01 15:00:55 crc kubenswrapper[4771]: E1001 15:00:55.198071 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899\": container with ID starting with f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899 not found: ID does not exist" containerID="f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.198099 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899"} err="failed to get container status \"f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899\": rpc error: code = NotFound desc = could not find container \"f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899\": container with ID starting with f6f1afa972fa89803fd7f98a31e66d614175b184d9c6cb8962079e93a350a899 not found: ID does not exist" Oct 01 15:00:55 crc kubenswrapper[4771]: I1001 15:00:55.999174 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" path="/var/lib/kubelet/pods/05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5/volumes" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.000431 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b248e756-e3b8-4fd9-b6fb-99ee87df696d" path="/var/lib/kubelet/pods/b248e756-e3b8-4fd9-b6fb-99ee87df696d/volumes" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.001325 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" path="/var/lib/kubelet/pods/d16ca88c-55fc-497e-818c-ed358e0c4bfb/volumes" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.003475 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" path="/var/lib/kubelet/pods/e62d1022-e44e-4353-a6da-b846b2cb2858/volumes" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.004625 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" path="/var/lib/kubelet/pods/ee2a2c00-b21b-45af-9de5-0cc26da899b3/volumes" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.110878 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9ftld"] Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.111223 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerName="extract-content" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.111307 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerName="extract-content" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.111365 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="extract-utilities" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.111416 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="extract-utilities" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.111474 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.111533 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.111592 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerName="extract-content" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.111685 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerName="extract-content" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.111809 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerName="extract-utilities" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.111892 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerName="extract-utilities" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.111971 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerName="extract-utilities" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.112098 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerName="extract-utilities" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.112212 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b248e756-e3b8-4fd9-b6fb-99ee87df696d" containerName="marketplace-operator" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.112307 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b248e756-e3b8-4fd9-b6fb-99ee87df696d" containerName="marketplace-operator" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.112418 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.112501 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.112581 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="extract-content" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.112663 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="extract-content" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.112763 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.112866 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.113117 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerName="extract-utilities" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.113246 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerName="extract-utilities" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.113316 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerName="extract-content" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.113376 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerName="extract-content" Oct 01 15:00:56 crc kubenswrapper[4771]: E1001 15:00:56.113436 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.113495 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.113654 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee2a2c00-b21b-45af-9de5-0cc26da899b3" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.113713 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b248e756-e3b8-4fd9-b6fb-99ee87df696d" containerName="marketplace-operator" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.113789 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="05d6c075-fd4d-4bc6-bafc-a9d0e1085dc5" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.113849 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d16ca88c-55fc-497e-818c-ed358e0c4bfb" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.113987 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e62d1022-e44e-4353-a6da-b846b2cb2858" containerName="registry-server" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.114656 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.117562 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.120621 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ftld"] Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.212437 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e61495fc-ef85-44de-8135-f080a089e4ed-catalog-content\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.212479 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvjhs\" (UniqueName: \"kubernetes.io/projected/e61495fc-ef85-44de-8135-f080a089e4ed-kube-api-access-jvjhs\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.212510 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e61495fc-ef85-44de-8135-f080a089e4ed-utilities\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.310842 4771 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-lzbhb"] Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.314893 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.314918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e61495fc-ef85-44de-8135-f080a089e4ed-catalog-content\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.315551 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvjhs\" (UniqueName: \"kubernetes.io/projected/e61495fc-ef85-44de-8135-f080a089e4ed-kube-api-access-jvjhs\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.315613 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e61495fc-ef85-44de-8135-f080a089e4ed-utilities\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.316486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e61495fc-ef85-44de-8135-f080a089e4ed-utilities\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.318400 4771 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.316824 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e61495fc-ef85-44de-8135-f080a089e4ed-catalog-content\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.335811 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvjhs\" (UniqueName: \"kubernetes.io/projected/e61495fc-ef85-44de-8135-f080a089e4ed-kube-api-access-jvjhs\") pod \"redhat-marketplace-9ftld\" (UID: \"e61495fc-ef85-44de-8135-f080a089e4ed\") " pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.350575 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzbhb"] Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.417268 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c77405-b06a-457f-ae26-aa105e1e638c-catalog-content\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.417606 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npr4g\" (UniqueName: \"kubernetes.io/projected/e8c77405-b06a-457f-ae26-aa105e1e638c-kube-api-access-npr4g\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.417781 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c77405-b06a-457f-ae26-aa105e1e638c-utilities\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.443258 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.518910 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c77405-b06a-457f-ae26-aa105e1e638c-catalog-content\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.519101 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npr4g\" (UniqueName: \"kubernetes.io/projected/e8c77405-b06a-457f-ae26-aa105e1e638c-kube-api-access-npr4g\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.519131 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c77405-b06a-457f-ae26-aa105e1e638c-utilities\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.519499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e8c77405-b06a-457f-ae26-aa105e1e638c-utilities\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 
15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.519751 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e8c77405-b06a-457f-ae26-aa105e1e638c-catalog-content\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.542564 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npr4g\" (UniqueName: \"kubernetes.io/projected/e8c77405-b06a-457f-ae26-aa105e1e638c-kube-api-access-npr4g\") pod \"redhat-operators-lzbhb\" (UID: \"e8c77405-b06a-457f-ae26-aa105e1e638c\") " pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.617377 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9ftld"] Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.635486 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:00:56 crc kubenswrapper[4771]: I1001 15:00:56.902075 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-lzbhb"] Oct 01 15:00:56 crc kubenswrapper[4771]: W1001 15:00:56.927193 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8c77405_b06a_457f_ae26_aa105e1e638c.slice/crio-fd3b2f09bc4b9611e317f78b3138a29a694407e9a37f4f652c7bcffb146df28f WatchSource:0}: Error finding container fd3b2f09bc4b9611e317f78b3138a29a694407e9a37f4f652c7bcffb146df28f: Status 404 returned error can't find the container with id fd3b2f09bc4b9611e317f78b3138a29a694407e9a37f4f652c7bcffb146df28f Oct 01 15:00:57 crc kubenswrapper[4771]: I1001 15:00:57.045822 4771 generic.go:334] "Generic (PLEG): container finished" podID="e61495fc-ef85-44de-8135-f080a089e4ed" containerID="58fe812c47d87e0da595c264b0ce54275d5203558b2ca94798b227f1b529597a" exitCode=0 Oct 01 15:00:57 crc kubenswrapper[4771]: I1001 15:00:57.045907 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ftld" event={"ID":"e61495fc-ef85-44de-8135-f080a089e4ed","Type":"ContainerDied","Data":"58fe812c47d87e0da595c264b0ce54275d5203558b2ca94798b227f1b529597a"} Oct 01 15:00:57 crc kubenswrapper[4771]: I1001 15:00:57.045944 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ftld" event={"ID":"e61495fc-ef85-44de-8135-f080a089e4ed","Type":"ContainerStarted","Data":"683925e85b058103df04518392aa0979e36ecb609172da36e450eb6d560ed5ad"} Oct 01 15:00:57 crc kubenswrapper[4771]: I1001 15:00:57.048111 4771 generic.go:334] "Generic (PLEG): container finished" podID="e8c77405-b06a-457f-ae26-aa105e1e638c" containerID="c2fa27160c6f4bdee88035e454a3b3bbf4091302e0dace270637df7221321f30" exitCode=0 Oct 01 15:00:57 crc kubenswrapper[4771]: I1001 
15:00:57.048358 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzbhb" event={"ID":"e8c77405-b06a-457f-ae26-aa105e1e638c","Type":"ContainerDied","Data":"c2fa27160c6f4bdee88035e454a3b3bbf4091302e0dace270637df7221321f30"} Oct 01 15:00:57 crc kubenswrapper[4771]: I1001 15:00:57.048388 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzbhb" event={"ID":"e8c77405-b06a-457f-ae26-aa105e1e638c","Type":"ContainerStarted","Data":"fd3b2f09bc4b9611e317f78b3138a29a694407e9a37f4f652c7bcffb146df28f"} Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.517455 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cxlz6"] Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.519181 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.521938 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.537679 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cxlz6"] Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.542515 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ead9f7-cc56-4d45-9718-d175502b54df-catalog-content\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.542610 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdkzp\" (UniqueName: 
\"kubernetes.io/projected/e0ead9f7-cc56-4d45-9718-d175502b54df-kube-api-access-hdkzp\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.542656 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ead9f7-cc56-4d45-9718-d175502b54df-utilities\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.643461 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e0ead9f7-cc56-4d45-9718-d175502b54df-utilities\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.643560 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ead9f7-cc56-4d45-9718-d175502b54df-catalog-content\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.643601 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdkzp\" (UniqueName: \"kubernetes.io/projected/e0ead9f7-cc56-4d45-9718-d175502b54df-kube-api-access-hdkzp\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.643951 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e0ead9f7-cc56-4d45-9718-d175502b54df-utilities\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.644172 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e0ead9f7-cc56-4d45-9718-d175502b54df-catalog-content\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.679216 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdkzp\" (UniqueName: \"kubernetes.io/projected/e0ead9f7-cc56-4d45-9718-d175502b54df-kube-api-access-hdkzp\") pod \"certified-operators-cxlz6\" (UID: \"e0ead9f7-cc56-4d45-9718-d175502b54df\") " pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.711320 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bv7vl"] Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.712395 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.714781 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.722168 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bv7vl"] Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.845179 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnp76\" (UniqueName: \"kubernetes.io/projected/2061f39f-1b36-4d01-b13f-33156f106012-kube-api-access-pnp76\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.845248 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2061f39f-1b36-4d01-b13f-33156f106012-utilities\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.845304 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2061f39f-1b36-4d01-b13f-33156f106012-catalog-content\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.845929 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.946594 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2061f39f-1b36-4d01-b13f-33156f106012-utilities\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.946994 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2061f39f-1b36-4d01-b13f-33156f106012-catalog-content\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.947039 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnp76\" (UniqueName: \"kubernetes.io/projected/2061f39f-1b36-4d01-b13f-33156f106012-kube-api-access-pnp76\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.947304 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2061f39f-1b36-4d01-b13f-33156f106012-utilities\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.947523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2061f39f-1b36-4d01-b13f-33156f106012-catalog-content\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " 
pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:58 crc kubenswrapper[4771]: I1001 15:00:58.984802 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnp76\" (UniqueName: \"kubernetes.io/projected/2061f39f-1b36-4d01-b13f-33156f106012-kube-api-access-pnp76\") pod \"community-operators-bv7vl\" (UID: \"2061f39f-1b36-4d01-b13f-33156f106012\") " pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:59 crc kubenswrapper[4771]: I1001 15:00:59.058969 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cxlz6"] Oct 01 15:00:59 crc kubenswrapper[4771]: I1001 15:00:59.060956 4771 generic.go:334] "Generic (PLEG): container finished" podID="e8c77405-b06a-457f-ae26-aa105e1e638c" containerID="db543509859f612688dbf44155932abc4ae29655e2fa40fab7871aaecca5846f" exitCode=0 Oct 01 15:00:59 crc kubenswrapper[4771]: I1001 15:00:59.061034 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzbhb" event={"ID":"e8c77405-b06a-457f-ae26-aa105e1e638c","Type":"ContainerDied","Data":"db543509859f612688dbf44155932abc4ae29655e2fa40fab7871aaecca5846f"} Oct 01 15:00:59 crc kubenswrapper[4771]: I1001 15:00:59.063156 4771 generic.go:334] "Generic (PLEG): container finished" podID="e61495fc-ef85-44de-8135-f080a089e4ed" containerID="5835cef6a16d89ec43e90b1bd47d1f380a70693db9cfb859565fe36c08a65e7f" exitCode=0 Oct 01 15:00:59 crc kubenswrapper[4771]: I1001 15:00:59.063189 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ftld" event={"ID":"e61495fc-ef85-44de-8135-f080a089e4ed","Type":"ContainerDied","Data":"5835cef6a16d89ec43e90b1bd47d1f380a70693db9cfb859565fe36c08a65e7f"} Oct 01 15:00:59 crc kubenswrapper[4771]: I1001 15:00:59.077028 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:00:59 crc kubenswrapper[4771]: I1001 15:00:59.256893 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bv7vl"] Oct 01 15:00:59 crc kubenswrapper[4771]: W1001 15:00:59.301185 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2061f39f_1b36_4d01_b13f_33156f106012.slice/crio-b465aa4ed4285732bde6f9cb53cb419ce4c021620459016742b81ae4909c9faf WatchSource:0}: Error finding container b465aa4ed4285732bde6f9cb53cb419ce4c021620459016742b81ae4909c9faf: Status 404 returned error can't find the container with id b465aa4ed4285732bde6f9cb53cb419ce4c021620459016742b81ae4909c9faf Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.070837 4771 generic.go:334] "Generic (PLEG): container finished" podID="e0ead9f7-cc56-4d45-9718-d175502b54df" containerID="ed853b9be913ee4e3e03934a3fddc7c780fff08bb72e7f790a957f01ac690375" exitCode=0 Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.070916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxlz6" event={"ID":"e0ead9f7-cc56-4d45-9718-d175502b54df","Type":"ContainerDied","Data":"ed853b9be913ee4e3e03934a3fddc7c780fff08bb72e7f790a957f01ac690375"} Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.070954 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxlz6" event={"ID":"e0ead9f7-cc56-4d45-9718-d175502b54df","Type":"ContainerStarted","Data":"8561125c04d48fd99f597358cd23c0a1ffadcadba1a7f42a0154459555a6a2c4"} Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.080556 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-lzbhb" 
event={"ID":"e8c77405-b06a-457f-ae26-aa105e1e638c","Type":"ContainerStarted","Data":"6f62ea30968e5bb210d6b72fbe55a3a2e4fa8bf15e2c8075354ba321bccb231e"} Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.084308 4771 generic.go:334] "Generic (PLEG): container finished" podID="2061f39f-1b36-4d01-b13f-33156f106012" containerID="a9a4c2b4fe31af1704020f107088abc837ea63891af3fe6aee0cfa8e931b2882" exitCode=0 Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.084386 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv7vl" event={"ID":"2061f39f-1b36-4d01-b13f-33156f106012","Type":"ContainerDied","Data":"a9a4c2b4fe31af1704020f107088abc837ea63891af3fe6aee0cfa8e931b2882"} Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.084416 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv7vl" event={"ID":"2061f39f-1b36-4d01-b13f-33156f106012","Type":"ContainerStarted","Data":"b465aa4ed4285732bde6f9cb53cb419ce4c021620459016742b81ae4909c9faf"} Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.113346 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9ftld" podStartSLOduration=1.209311075 podStartE2EDuration="4.113329104s" podCreationTimestamp="2025-10-01 15:00:56 +0000 UTC" firstStartedPulling="2025-10-01 15:00:57.047065134 +0000 UTC m=+301.666240305" lastFinishedPulling="2025-10-01 15:00:59.951083163 +0000 UTC m=+304.570258334" observedRunningTime="2025-10-01 15:01:00.111208708 +0000 UTC m=+304.730383879" watchObservedRunningTime="2025-10-01 15:01:00.113329104 +0000 UTC m=+304.732504275" Oct 01 15:01:00 crc kubenswrapper[4771]: I1001 15:01:00.135583 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-lzbhb" podStartSLOduration=1.353813075 podStartE2EDuration="4.135565743s" podCreationTimestamp="2025-10-01 15:00:56 +0000 UTC" 
firstStartedPulling="2025-10-01 15:00:57.050836874 +0000 UTC m=+301.670012045" lastFinishedPulling="2025-10-01 15:00:59.832589542 +0000 UTC m=+304.451764713" observedRunningTime="2025-10-01 15:01:00.131535836 +0000 UTC m=+304.750711027" watchObservedRunningTime="2025-10-01 15:01:00.135565743 +0000 UTC m=+304.754740914" Oct 01 15:01:01 crc kubenswrapper[4771]: I1001 15:01:01.099225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9ftld" event={"ID":"e61495fc-ef85-44de-8135-f080a089e4ed","Type":"ContainerStarted","Data":"f2a3164ec9f298d9c8e3cee20263e163ceacf2cbd45abf8413fd2ce3398ee719"} Oct 01 15:01:02 crc kubenswrapper[4771]: I1001 15:01:02.109681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxlz6" event={"ID":"e0ead9f7-cc56-4d45-9718-d175502b54df","Type":"ContainerStarted","Data":"ede0d8b9d243feacfa0234d56838218c8daf9ea1ef081b8e1ae5f2558d2dd4b5"} Oct 01 15:01:02 crc kubenswrapper[4771]: I1001 15:01:02.112240 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv7vl" event={"ID":"2061f39f-1b36-4d01-b13f-33156f106012","Type":"ContainerStarted","Data":"bc80bae3ac16445dcfc73a0bb32af27df6d26f239e46266590434e9eb5911707"} Oct 01 15:01:03 crc kubenswrapper[4771]: I1001 15:01:03.126213 4771 generic.go:334] "Generic (PLEG): container finished" podID="2061f39f-1b36-4d01-b13f-33156f106012" containerID="bc80bae3ac16445dcfc73a0bb32af27df6d26f239e46266590434e9eb5911707" exitCode=0 Oct 01 15:01:03 crc kubenswrapper[4771]: I1001 15:01:03.126403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv7vl" event={"ID":"2061f39f-1b36-4d01-b13f-33156f106012","Type":"ContainerDied","Data":"bc80bae3ac16445dcfc73a0bb32af27df6d26f239e46266590434e9eb5911707"} Oct 01 15:01:03 crc kubenswrapper[4771]: I1001 15:01:03.132024 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="e0ead9f7-cc56-4d45-9718-d175502b54df" containerID="ede0d8b9d243feacfa0234d56838218c8daf9ea1ef081b8e1ae5f2558d2dd4b5" exitCode=0 Oct 01 15:01:03 crc kubenswrapper[4771]: I1001 15:01:03.132067 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxlz6" event={"ID":"e0ead9f7-cc56-4d45-9718-d175502b54df","Type":"ContainerDied","Data":"ede0d8b9d243feacfa0234d56838218c8daf9ea1ef081b8e1ae5f2558d2dd4b5"} Oct 01 15:01:04 crc kubenswrapper[4771]: I1001 15:01:04.142593 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bv7vl" event={"ID":"2061f39f-1b36-4d01-b13f-33156f106012","Type":"ContainerStarted","Data":"299e4e64751e34947d22ba4879cb6e2604f698624074071ce5b5cc116faa301b"} Oct 01 15:01:04 crc kubenswrapper[4771]: I1001 15:01:04.169327 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bv7vl" podStartSLOduration=2.717096598 podStartE2EDuration="6.169266937s" podCreationTimestamp="2025-10-01 15:00:58 +0000 UTC" firstStartedPulling="2025-10-01 15:01:00.086572885 +0000 UTC m=+304.705748056" lastFinishedPulling="2025-10-01 15:01:03.538743224 +0000 UTC m=+308.157918395" observedRunningTime="2025-10-01 15:01:04.165842596 +0000 UTC m=+308.785017807" watchObservedRunningTime="2025-10-01 15:01:04.169266937 +0000 UTC m=+308.788442108" Oct 01 15:01:05 crc kubenswrapper[4771]: I1001 15:01:05.149391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cxlz6" event={"ID":"e0ead9f7-cc56-4d45-9718-d175502b54df","Type":"ContainerStarted","Data":"8a045985af2c57442348075f54c95876d713d50f2d67536c946b37f8e60fd6a9"} Oct 01 15:01:05 crc kubenswrapper[4771]: I1001 15:01:05.168397 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cxlz6" podStartSLOduration=3.192657584 podStartE2EDuration="7.168379781s" 
podCreationTimestamp="2025-10-01 15:00:58 +0000 UTC" firstStartedPulling="2025-10-01 15:01:00.074874185 +0000 UTC m=+304.694049366" lastFinishedPulling="2025-10-01 15:01:04.050596392 +0000 UTC m=+308.669771563" observedRunningTime="2025-10-01 15:01:05.165764173 +0000 UTC m=+309.784939344" watchObservedRunningTime="2025-10-01 15:01:05.168379781 +0000 UTC m=+309.787554952" Oct 01 15:01:06 crc kubenswrapper[4771]: I1001 15:01:06.443409 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:01:06 crc kubenswrapper[4771]: I1001 15:01:06.444211 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:01:06 crc kubenswrapper[4771]: I1001 15:01:06.481338 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:01:06 crc kubenswrapper[4771]: I1001 15:01:06.636010 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:01:06 crc kubenswrapper[4771]: I1001 15:01:06.636565 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:01:06 crc kubenswrapper[4771]: I1001 15:01:06.677994 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:01:07 crc kubenswrapper[4771]: I1001 15:01:07.197282 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9ftld" Oct 01 15:01:07 crc kubenswrapper[4771]: I1001 15:01:07.198665 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-lzbhb" Oct 01 15:01:08 crc kubenswrapper[4771]: I1001 15:01:08.846675 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:01:08 crc kubenswrapper[4771]: I1001 15:01:08.847228 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:01:08 crc kubenswrapper[4771]: I1001 15:01:08.890812 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:01:09 crc kubenswrapper[4771]: I1001 15:01:09.077992 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:01:09 crc kubenswrapper[4771]: I1001 15:01:09.078063 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:01:09 crc kubenswrapper[4771]: I1001 15:01:09.139365 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:01:09 crc kubenswrapper[4771]: I1001 15:01:09.224051 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cxlz6" Oct 01 15:01:09 crc kubenswrapper[4771]: I1001 15:01:09.238288 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bv7vl" Oct 01 15:02:12 crc kubenswrapper[4771]: I1001 15:02:12.177245 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:02:12 crc kubenswrapper[4771]: I1001 15:02:12.178507 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:02:39 crc kubenswrapper[4771]: I1001 15:02:39.964973 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-69gzz"] Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:39.966363 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.028596 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-69gzz"] Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.058354 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea365acf-d04c-42f8-895d-0cfbadc1fb88-registry-certificates\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.058436 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea365acf-d04c-42f8-895d-0cfbadc1fb88-installation-pull-secrets\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.058458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-bound-sa-token\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.058479 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea365acf-d04c-42f8-895d-0cfbadc1fb88-trusted-ca\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.058518 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea365acf-d04c-42f8-895d-0cfbadc1fb88-ca-trust-extracted\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.058546 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbqrk\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-kube-api-access-cbqrk\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.058563 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-registry-tls\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.058598 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.087942 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.161299 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea365acf-d04c-42f8-895d-0cfbadc1fb88-installation-pull-secrets\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.161497 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-bound-sa-token\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.161686 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea365acf-d04c-42f8-895d-0cfbadc1fb88-trusted-ca\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.161895 
4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea365acf-d04c-42f8-895d-0cfbadc1fb88-ca-trust-extracted\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.161943 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbqrk\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-kube-api-access-cbqrk\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.161986 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-registry-tls\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.162051 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea365acf-d04c-42f8-895d-0cfbadc1fb88-registry-certificates\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.162451 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ea365acf-d04c-42f8-895d-0cfbadc1fb88-ca-trust-extracted\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.166055 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea365acf-d04c-42f8-895d-0cfbadc1fb88-trusted-ca\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.167073 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ea365acf-d04c-42f8-895d-0cfbadc1fb88-registry-certificates\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.168693 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ea365acf-d04c-42f8-895d-0cfbadc1fb88-installation-pull-secrets\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.169015 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-registry-tls\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.189849 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbqrk\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-kube-api-access-cbqrk\") pod \"image-registry-66df7c8f76-69gzz\" 
(UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.192391 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea365acf-d04c-42f8-895d-0cfbadc1fb88-bound-sa-token\") pod \"image-registry-66df7c8f76-69gzz\" (UID: \"ea365acf-d04c-42f8-895d-0cfbadc1fb88\") " pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.294570 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.481786 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-69gzz"] Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.757451 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" event={"ID":"ea365acf-d04c-42f8-895d-0cfbadc1fb88","Type":"ContainerStarted","Data":"dd97d5610bededa98fd051ba9ce2b2ab12d56b450332bb1092d119ee06240080"} Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.757516 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" event={"ID":"ea365acf-d04c-42f8-895d-0cfbadc1fb88","Type":"ContainerStarted","Data":"5ccd75fef313dc7cf4e1b08dc33ffbfcdc421af721cd9e84d8cce6de55797d48"} Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.757925 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:02:40 crc kubenswrapper[4771]: I1001 15:02:40.789697 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" podStartSLOduration=1.789667536 
podStartE2EDuration="1.789667536s" podCreationTimestamp="2025-10-01 15:02:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:02:40.788181537 +0000 UTC m=+405.407356718" watchObservedRunningTime="2025-10-01 15:02:40.789667536 +0000 UTC m=+405.408842747" Oct 01 15:02:42 crc kubenswrapper[4771]: I1001 15:02:42.178390 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:02:42 crc kubenswrapper[4771]: I1001 15:02:42.179095 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:03:00 crc kubenswrapper[4771]: I1001 15:03:00.307202 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-69gzz" Oct 01 15:03:00 crc kubenswrapper[4771]: I1001 15:03:00.372327 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhdpb"] Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.177567 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.178501 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.178568 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.179455 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77301768e84aaf763ab3e5229d66257dc37283214f6e3b73666999b30255a8ac"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.179566 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://77301768e84aaf763ab3e5229d66257dc37283214f6e3b73666999b30255a8ac" gracePeriod=600 Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.942773 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="77301768e84aaf763ab3e5229d66257dc37283214f6e3b73666999b30255a8ac" exitCode=0 Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.942870 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"77301768e84aaf763ab3e5229d66257dc37283214f6e3b73666999b30255a8ac"} Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.943117 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"19a5b763fc09a48e284061322b6ea6e90ab3ff0404cebd1078a70132290e4cb2"} Oct 01 15:03:12 crc kubenswrapper[4771]: I1001 15:03:12.943138 4771 scope.go:117] "RemoveContainer" containerID="161a9b5c96d73213aa3d0261956487464c5e1398e47d221db9fccafdfdc51856" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.429253 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" podUID="8b800e30-2559-4c0b-9732-7a069ae3da91" containerName="registry" containerID="cri-o://618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175" gracePeriod=30 Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.783922 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.900129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-certificates\") pod \"8b800e30-2559-4c0b-9732-7a069ae3da91\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.900377 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8b800e30-2559-4c0b-9732-7a069ae3da91\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.900407 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rss87\" (UniqueName: 
\"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-kube-api-access-rss87\") pod \"8b800e30-2559-4c0b-9732-7a069ae3da91\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.900423 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-bound-sa-token\") pod \"8b800e30-2559-4c0b-9732-7a069ae3da91\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.900448 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-trusted-ca\") pod \"8b800e30-2559-4c0b-9732-7a069ae3da91\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.900474 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-tls\") pod \"8b800e30-2559-4c0b-9732-7a069ae3da91\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.900501 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b800e30-2559-4c0b-9732-7a069ae3da91-installation-pull-secrets\") pod \"8b800e30-2559-4c0b-9732-7a069ae3da91\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.900520 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b800e30-2559-4c0b-9732-7a069ae3da91-ca-trust-extracted\") pod \"8b800e30-2559-4c0b-9732-7a069ae3da91\" (UID: \"8b800e30-2559-4c0b-9732-7a069ae3da91\") " Oct 01 15:03:25 crc 
kubenswrapper[4771]: I1001 15:03:25.900994 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8b800e30-2559-4c0b-9732-7a069ae3da91" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.902046 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8b800e30-2559-4c0b-9732-7a069ae3da91" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.908467 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b800e30-2559-4c0b-9732-7a069ae3da91-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8b800e30-2559-4c0b-9732-7a069ae3da91" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.908508 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-kube-api-access-rss87" (OuterVolumeSpecName: "kube-api-access-rss87") pod "8b800e30-2559-4c0b-9732-7a069ae3da91" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91"). InnerVolumeSpecName "kube-api-access-rss87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.910010 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8b800e30-2559-4c0b-9732-7a069ae3da91" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.910339 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8b800e30-2559-4c0b-9732-7a069ae3da91" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.914443 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "8b800e30-2559-4c0b-9732-7a069ae3da91" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 01 15:03:25 crc kubenswrapper[4771]: I1001 15:03:25.917032 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b800e30-2559-4c0b-9732-7a069ae3da91-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8b800e30-2559-4c0b-9732-7a069ae3da91" (UID: "8b800e30-2559-4c0b-9732-7a069ae3da91"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.001970 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rss87\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-kube-api-access-rss87\") on node \"crc\" DevicePath \"\"" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.002002 4771 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.002011 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.002020 4771 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.002028 4771 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8b800e30-2559-4c0b-9732-7a069ae3da91-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.002037 4771 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8b800e30-2559-4c0b-9732-7a069ae3da91-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.002045 4771 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8b800e30-2559-4c0b-9732-7a069ae3da91-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 01 15:03:26 crc 
kubenswrapper[4771]: I1001 15:03:26.034810 4771 generic.go:334] "Generic (PLEG): container finished" podID="8b800e30-2559-4c0b-9732-7a069ae3da91" containerID="618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175" exitCode=0 Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.034854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" event={"ID":"8b800e30-2559-4c0b-9732-7a069ae3da91","Type":"ContainerDied","Data":"618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175"} Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.034902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" event={"ID":"8b800e30-2559-4c0b-9732-7a069ae3da91","Type":"ContainerDied","Data":"35f75a6301fa6e562a0456ad86e9c733adb667c160045964ade968b6944c736b"} Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.034930 4771 scope.go:117] "RemoveContainer" containerID="618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.035222 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-zhdpb" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.052035 4771 scope.go:117] "RemoveContainer" containerID="618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175" Oct 01 15:03:26 crc kubenswrapper[4771]: E1001 15:03:26.052677 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175\": container with ID starting with 618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175 not found: ID does not exist" containerID="618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.052753 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175"} err="failed to get container status \"618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175\": rpc error: code = NotFound desc = could not find container \"618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175\": container with ID starting with 618e1641aa67bab1f7d90d08fc7f0331d42f857de85d23fb278c9da637f2c175 not found: ID does not exist" Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.059040 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhdpb"] Oct 01 15:03:26 crc kubenswrapper[4771]: I1001 15:03:26.064792 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-zhdpb"] Oct 01 15:03:27 crc kubenswrapper[4771]: I1001 15:03:27.994016 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b800e30-2559-4c0b-9732-7a069ae3da91" path="/var/lib/kubelet/pods/8b800e30-2559-4c0b-9732-7a069ae3da91/volumes" Oct 01 15:05:12 crc kubenswrapper[4771]: I1001 
15:05:12.177932 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:05:12 crc kubenswrapper[4771]: I1001 15:05:12.178567 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:05:42 crc kubenswrapper[4771]: I1001 15:05:42.177937 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:05:42 crc kubenswrapper[4771]: I1001 15:05:42.178738 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:06:12 crc kubenswrapper[4771]: I1001 15:06:12.177489 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:06:12 crc kubenswrapper[4771]: I1001 15:06:12.178122 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" 
podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:06:12 crc kubenswrapper[4771]: I1001 15:06:12.178195 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:06:12 crc kubenswrapper[4771]: I1001 15:06:12.179096 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19a5b763fc09a48e284061322b6ea6e90ab3ff0404cebd1078a70132290e4cb2"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:06:12 crc kubenswrapper[4771]: I1001 15:06:12.179220 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://19a5b763fc09a48e284061322b6ea6e90ab3ff0404cebd1078a70132290e4cb2" gracePeriod=600 Oct 01 15:06:13 crc kubenswrapper[4771]: I1001 15:06:13.124473 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="19a5b763fc09a48e284061322b6ea6e90ab3ff0404cebd1078a70132290e4cb2" exitCode=0 Oct 01 15:06:13 crc kubenswrapper[4771]: I1001 15:06:13.124534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"19a5b763fc09a48e284061322b6ea6e90ab3ff0404cebd1078a70132290e4cb2"} Oct 01 15:06:13 crc kubenswrapper[4771]: I1001 15:06:13.125087 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"80215404bb4371102288dbe39becc7517e16d1990f65ff6bea20c4cb7f6f0681"} Oct 01 15:06:13 crc kubenswrapper[4771]: I1001 15:06:13.125110 4771 scope.go:117] "RemoveContainer" containerID="77301768e84aaf763ab3e5229d66257dc37283214f6e3b73666999b30255a8ac" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.019859 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wjfnq"] Oct 01 15:06:18 crc kubenswrapper[4771]: E1001 15:06:18.020447 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b800e30-2559-4c0b-9732-7a069ae3da91" containerName="registry" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.020458 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b800e30-2559-4c0b-9732-7a069ae3da91" containerName="registry" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.020553 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b800e30-2559-4c0b-9732-7a069ae3da91" containerName="registry" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.020974 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wjfnq" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.022807 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.028059 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.028206 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wjj2s" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.037715 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wjfnq"] Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.039597 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-n6g7s"] Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.040198 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-n6g7s" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.042138 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-2gmfv" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.049295 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlfh\" (UniqueName: \"kubernetes.io/projected/153105d4-1f8e-43f2-bcea-0f3a36598eb0-kube-api-access-6qlfh\") pod \"cert-manager-cainjector-7f985d654d-wjfnq\" (UID: \"153105d4-1f8e-43f2-bcea-0f3a36598eb0\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wjfnq" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.053887 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-n6g7s"] Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.056048 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9zpvt"] Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.056656 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.061561 4771 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-j22ng" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.075960 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9zpvt"] Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.150207 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhstv\" (UniqueName: \"kubernetes.io/projected/96d1876b-3e09-4899-8b04-a49c88ebf65d-kube-api-access-nhstv\") pod \"cert-manager-5b446d88c5-n6g7s\" (UID: \"96d1876b-3e09-4899-8b04-a49c88ebf65d\") " pod="cert-manager/cert-manager-5b446d88c5-n6g7s" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.150280 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlfh\" (UniqueName: \"kubernetes.io/projected/153105d4-1f8e-43f2-bcea-0f3a36598eb0-kube-api-access-6qlfh\") pod \"cert-manager-cainjector-7f985d654d-wjfnq\" (UID: \"153105d4-1f8e-43f2-bcea-0f3a36598eb0\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wjfnq" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.150415 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvjcm\" (UniqueName: \"kubernetes.io/projected/fdd5e5a8-3303-4cee-ad76-d47d1a0da067-kube-api-access-hvjcm\") pod \"cert-manager-webhook-5655c58dd6-9zpvt\" (UID: \"fdd5e5a8-3303-4cee-ad76-d47d1a0da067\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.176014 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlfh\" (UniqueName: 
\"kubernetes.io/projected/153105d4-1f8e-43f2-bcea-0f3a36598eb0-kube-api-access-6qlfh\") pod \"cert-manager-cainjector-7f985d654d-wjfnq\" (UID: \"153105d4-1f8e-43f2-bcea-0f3a36598eb0\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-wjfnq" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.251912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvjcm\" (UniqueName: \"kubernetes.io/projected/fdd5e5a8-3303-4cee-ad76-d47d1a0da067-kube-api-access-hvjcm\") pod \"cert-manager-webhook-5655c58dd6-9zpvt\" (UID: \"fdd5e5a8-3303-4cee-ad76-d47d1a0da067\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.252238 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhstv\" (UniqueName: \"kubernetes.io/projected/96d1876b-3e09-4899-8b04-a49c88ebf65d-kube-api-access-nhstv\") pod \"cert-manager-5b446d88c5-n6g7s\" (UID: \"96d1876b-3e09-4899-8b04-a49c88ebf65d\") " pod="cert-manager/cert-manager-5b446d88c5-n6g7s" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.271364 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhstv\" (UniqueName: \"kubernetes.io/projected/96d1876b-3e09-4899-8b04-a49c88ebf65d-kube-api-access-nhstv\") pod \"cert-manager-5b446d88c5-n6g7s\" (UID: \"96d1876b-3e09-4899-8b04-a49c88ebf65d\") " pod="cert-manager/cert-manager-5b446d88c5-n6g7s" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.282967 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvjcm\" (UniqueName: \"kubernetes.io/projected/fdd5e5a8-3303-4cee-ad76-d47d1a0da067-kube-api-access-hvjcm\") pod \"cert-manager-webhook-5655c58dd6-9zpvt\" (UID: \"fdd5e5a8-3303-4cee-ad76-d47d1a0da067\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.337071 4771 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-wjfnq" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.358567 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-n6g7s" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.372855 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.639098 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-9zpvt"] Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.650760 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.803582 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-wjfnq"] Oct 01 15:06:18 crc kubenswrapper[4771]: I1001 15:06:18.810333 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-n6g7s"] Oct 01 15:06:18 crc kubenswrapper[4771]: W1001 15:06:18.817983 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96d1876b_3e09_4899_8b04_a49c88ebf65d.slice/crio-372779e19fa13d2ecbf964a09a3fa958009605c2b6b486a9cd4d18709cfa1cd3 WatchSource:0}: Error finding container 372779e19fa13d2ecbf964a09a3fa958009605c2b6b486a9cd4d18709cfa1cd3: Status 404 returned error can't find the container with id 372779e19fa13d2ecbf964a09a3fa958009605c2b6b486a9cd4d18709cfa1cd3 Oct 01 15:06:19 crc kubenswrapper[4771]: I1001 15:06:19.165659 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-n6g7s" 
event={"ID":"96d1876b-3e09-4899-8b04-a49c88ebf65d","Type":"ContainerStarted","Data":"372779e19fa13d2ecbf964a09a3fa958009605c2b6b486a9cd4d18709cfa1cd3"} Oct 01 15:06:19 crc kubenswrapper[4771]: I1001 15:06:19.166992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wjfnq" event={"ID":"153105d4-1f8e-43f2-bcea-0f3a36598eb0","Type":"ContainerStarted","Data":"d8100f611637eaa7e3140b075636ec51b1f49959c23921a8b7c1c3a7180fd4dd"} Oct 01 15:06:19 crc kubenswrapper[4771]: I1001 15:06:19.168310 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" event={"ID":"fdd5e5a8-3303-4cee-ad76-d47d1a0da067","Type":"ContainerStarted","Data":"483a3c4a0ec8fb6426a1aa3cb75b999631a8bd7cc228886a1c25452e8da84acb"} Oct 01 15:06:22 crc kubenswrapper[4771]: I1001 15:06:22.187499 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-n6g7s" event={"ID":"96d1876b-3e09-4899-8b04-a49c88ebf65d","Type":"ContainerStarted","Data":"43452475f478b33a6ffa53cd88364eff507048c37e6d37a8a8020b5f2cabbcf9"} Oct 01 15:06:23 crc kubenswrapper[4771]: I1001 15:06:23.195236 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" event={"ID":"fdd5e5a8-3303-4cee-ad76-d47d1a0da067","Type":"ContainerStarted","Data":"1a6ce7309b80e0735950e11797d32047469dd19575e3bfd3a94244b6e1d2b28a"} Oct 01 15:06:23 crc kubenswrapper[4771]: I1001 15:06:23.195388 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" Oct 01 15:06:23 crc kubenswrapper[4771]: I1001 15:06:23.197099 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-wjfnq" event={"ID":"153105d4-1f8e-43f2-bcea-0f3a36598eb0","Type":"ContainerStarted","Data":"d0fda8c3b00048050b77e8f6b397da92cd94d93bbfd5668db1e8803e3cf8e904"} Oct 01 15:06:23 crc 
kubenswrapper[4771]: I1001 15:06:23.213639 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" podStartSLOduration=1.90794666 podStartE2EDuration="5.21362243s" podCreationTimestamp="2025-10-01 15:06:18 +0000 UTC" firstStartedPulling="2025-10-01 15:06:18.650493273 +0000 UTC m=+623.269668444" lastFinishedPulling="2025-10-01 15:06:21.956169043 +0000 UTC m=+626.575344214" observedRunningTime="2025-10-01 15:06:23.212478133 +0000 UTC m=+627.831653334" watchObservedRunningTime="2025-10-01 15:06:23.21362243 +0000 UTC m=+627.832797601" Oct 01 15:06:23 crc kubenswrapper[4771]: I1001 15:06:23.241329 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-n6g7s" podStartSLOduration=2.105095591 podStartE2EDuration="5.241306966s" podCreationTimestamp="2025-10-01 15:06:18 +0000 UTC" firstStartedPulling="2025-10-01 15:06:18.82002956 +0000 UTC m=+623.439204741" lastFinishedPulling="2025-10-01 15:06:21.956240915 +0000 UTC m=+626.575416116" observedRunningTime="2025-10-01 15:06:23.238695941 +0000 UTC m=+627.857871122" watchObservedRunningTime="2025-10-01 15:06:23.241306966 +0000 UTC m=+627.860482177" Oct 01 15:06:23 crc kubenswrapper[4771]: I1001 15:06:23.267540 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-wjfnq" podStartSLOduration=2.060850486 podStartE2EDuration="5.267512585s" podCreationTimestamp="2025-10-01 15:06:18 +0000 UTC" firstStartedPulling="2025-10-01 15:06:18.81235123 +0000 UTC m=+623.431526411" lastFinishedPulling="2025-10-01 15:06:22.019013329 +0000 UTC m=+626.638188510" observedRunningTime="2025-10-01 15:06:23.259710881 +0000 UTC m=+627.878886072" watchObservedRunningTime="2025-10-01 15:06:23.267512585 +0000 UTC m=+627.886687776" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.377277 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-9zpvt" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.452615 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j7ntp"] Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.453271 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovn-controller" containerID="cri-o://4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183" gracePeriod=30 Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.453354 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="nbdb" containerID="cri-o://eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6" gracePeriod=30 Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.453436 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="northd" containerID="cri-o://1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101" gracePeriod=30 Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.453506 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70" gracePeriod=30 Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.453562 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kube-rbac-proxy-node" 
containerID="cri-o://c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f" gracePeriod=30 Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.453618 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovn-acl-logging" containerID="cri-o://1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b" gracePeriod=30 Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.454072 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="sbdb" containerID="cri-o://4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32" gracePeriod=30 Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.505881 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" containerID="cri-o://4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3" gracePeriod=30 Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.792779 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/3.log" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.795643 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovn-acl-logging/0.log" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.796258 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovn-controller/0.log" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.796843 4771 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.814953 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-netns\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.814995 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-node-log\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815014 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-bin\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815042 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-env-overrides\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815066 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-openvswitch\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-nnspx\" (UniqueName: \"kubernetes.io/projected/a061b8e2-74a8-4953-bfa2-5090a2f70459-kube-api-access-nnspx\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815099 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-systemd-units\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815114 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-ovn\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815130 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-config\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815122 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815152 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-netd\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815188 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815225 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-slash\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815228 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-node-log" (OuterVolumeSpecName: "node-log") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). 
InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815245 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815256 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-log-socket\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815326 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-slash" (OuterVolumeSpecName: "host-slash") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815342 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-ovn-kubernetes\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815356 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-log-socket" (OuterVolumeSpecName: "log-socket") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815368 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-etc-openvswitch\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815385 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815402 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovn-node-metrics-cert\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815413 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815429 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-script-lib\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815461 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-kubelet\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815481 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-systemd\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815503 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-var-lib-openvswitch\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815528 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a061b8e2-74a8-4953-bfa2-5090a2f70459\" (UID: \"a061b8e2-74a8-4953-bfa2-5090a2f70459\") " Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815832 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816003 4771 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816024 4771 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-node-log\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816062 4771 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816078 4771 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816089 4771 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816100 4771 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816110 4771 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-slash\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816121 4771 reconciler_common.go:293] "Volume 
detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-log-socket\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816133 4771 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816016 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816143 4771 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816207 4771 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-kubelet\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.815160 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816079 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816067 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.816121 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.821528 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a061b8e2-74a8-4953-bfa2-5090a2f70459-kube-api-access-nnspx" (OuterVolumeSpecName: "kube-api-access-nnspx") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "kube-api-access-nnspx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.821896 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.840627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a061b8e2-74a8-4953-bfa2-5090a2f70459" (UID: "a061b8e2-74a8-4953-bfa2-5090a2f70459"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.856477 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j8plj"] Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.856873 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.856912 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.856936 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="sbdb" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.856949 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="sbdb" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.856972 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.856985 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857001 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857013 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857030 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kube-rbac-proxy-node" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857042 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kube-rbac-proxy-node" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857061 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kubecfg-setup" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857074 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kubecfg-setup" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857090 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857102 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857117 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovn-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857128 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovn-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857146 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="northd" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857158 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="northd" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857175 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="nbdb" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857187 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="nbdb" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857201 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovn-acl-logging" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857214 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovn-acl-logging" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857374 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovn-acl-logging" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857391 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857404 4771 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kube-rbac-proxy-ovn-metrics" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857422 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="nbdb" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857440 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857458 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857476 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857494 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="northd" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857510 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovn-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857528 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="sbdb" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857542 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="kube-rbac-proxy-node" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857818 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857847 4771 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: E1001 15:06:28.857872 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.857884 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.858042 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerName="ovnkube-controller" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.861139 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917372 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-ovn\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917411 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovn-node-metrics-cert\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917451 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-slash\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917492 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-cni-netd\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917520 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-env-overrides\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917605 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-var-lib-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-etc-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917681 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-run-netns\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917915 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rxq2\" (UniqueName: \"kubernetes.io/projected/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-kube-api-access-5rxq2\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917955 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-node-log\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.917988 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-kubelet\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918063 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-log-socket\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918107 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovnkube-script-lib\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918157 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-systemd-units\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918188 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-cni-bin\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918220 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovnkube-config\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918249 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918298 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918394 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-systemd\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918518 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918539 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918552 4771 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 
15:06:28.918565 4771 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918578 4771 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918593 4771 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a061b8e2-74a8-4953-bfa2-5090a2f70459-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918606 4771 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918621 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnspx\" (UniqueName: \"kubernetes.io/projected/a061b8e2-74a8-4953-bfa2-5090a2f70459-kube-api-access-nnspx\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:28 crc kubenswrapper[4771]: I1001 15:06:28.918633 4771 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a061b8e2-74a8-4953-bfa2-5090a2f70459-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.019769 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovnkube-script-lib\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.019862 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-systemd-units\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.019908 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-cni-bin\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.019958 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovnkube-config\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.019995 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020009 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-systemd-units\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: 
I1001 15:06:29.020050 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-cni-bin\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020094 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020105 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-systemd\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020141 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020173 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-systemd\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020190 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020232 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020245 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-ovn\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020287 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-run-ovn\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020297 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovn-node-metrics-cert\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-slash\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020403 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-cni-netd\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020407 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-slash\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020434 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-env-overrides\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020466 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-cni-netd\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020509 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-var-lib-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020540 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-etc-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020570 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-run-netns\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020625 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-node-log\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020657 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rxq2\" (UniqueName: 
\"kubernetes.io/projected/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-kube-api-access-5rxq2\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020663 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-var-lib-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020715 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-etc-openvswitch\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020795 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovnkube-config\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020811 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-kubelet\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-kubelet\") pod 
\"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020841 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovnkube-script-lib\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020856 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-node-log\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020869 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-log-socket\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020908 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-host-run-netns\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020986 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-env-overrides\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" 
Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.020889 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-log-socket\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.025648 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-ovn-node-metrics-cert\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.039368 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rxq2\" (UniqueName: \"kubernetes.io/projected/86a1c36c-3910-41d0-bb5b-b2603e3c78dc-kube-api-access-5rxq2\") pod \"ovnkube-node-j8plj\" (UID: \"86a1c36c-3910-41d0-bb5b-b2603e3c78dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.177385 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:29 crc kubenswrapper[4771]: W1001 15:06:29.210078 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a1c36c_3910_41d0_bb5b_b2603e3c78dc.slice/crio-15ae7f642626cd9560548116654c302b24039681f8a704eb81840df21b4b463b WatchSource:0}: Error finding container 15ae7f642626cd9560548116654c302b24039681f8a704eb81840df21b4b463b: Status 404 returned error can't find the container with id 15ae7f642626cd9560548116654c302b24039681f8a704eb81840df21b4b463b Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.234443 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"15ae7f642626cd9560548116654c302b24039681f8a704eb81840df21b4b463b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.236425 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/2.log" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.237126 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/1.log" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.237177 4771 generic.go:334] "Generic (PLEG): container finished" podID="c96a3328-c79b-4528-b9b5-badbc7380dd6" containerID="2d79a918c545b25fd3949c7e72a6a5446d3d66da50fda7363410f26ddf35b04e" exitCode=2 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.237262 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lvcz" event={"ID":"c96a3328-c79b-4528-b9b5-badbc7380dd6","Type":"ContainerDied","Data":"2d79a918c545b25fd3949c7e72a6a5446d3d66da50fda7363410f26ddf35b04e"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.237296 4771 
scope.go:117] "RemoveContainer" containerID="4fb4d8406dba14d03e2f5ab3e220aadd2d4181a22563e3e178108c5d8e1b4e2b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.238308 4771 scope.go:117] "RemoveContainer" containerID="2d79a918c545b25fd3949c7e72a6a5446d3d66da50fda7363410f26ddf35b04e" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.238807 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9lvcz_openshift-multus(c96a3328-c79b-4528-b9b5-badbc7380dd6)\"" pod="openshift-multus/multus-9lvcz" podUID="c96a3328-c79b-4528-b9b5-badbc7380dd6" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.241006 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovnkube-controller/3.log" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.246594 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovn-acl-logging/0.log" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247135 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j7ntp_a061b8e2-74a8-4953-bfa2-5090a2f70459/ovn-controller/0.log" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247547 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3" exitCode=0 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247568 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32" exitCode=0 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247575 4771 generic.go:334] "Generic 
(PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6" exitCode=0 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247581 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101" exitCode=0 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247587 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70" exitCode=0 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247594 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f" exitCode=0 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247600 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b" exitCode=143 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247607 4771 generic.go:334] "Generic (PLEG): container finished" podID="a061b8e2-74a8-4953-bfa2-5090a2f70459" containerID="4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183" exitCode=143 Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247652 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" 
event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247663 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247672 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247691 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247701 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247710 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247716 4771 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247722 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247750 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247759 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247764 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247769 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247774 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247779 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247786 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247795 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247800 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247805 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247810 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247815 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247820 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247825 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} Oct 01 
15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247830 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247835 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247840 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247847 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247855 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247862 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247868 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247874 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247880 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247886 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247891 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247898 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247903 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247911 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247919 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" event={"ID":"a061b8e2-74a8-4953-bfa2-5090a2f70459","Type":"ContainerDied","Data":"6f55c7ea12abf59b4a544c6ba7573cdcbc6029096e356a953a56fbd258cc3dd1"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247928 4771 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247935 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247941 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247948 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247954 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247960 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247966 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247972 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247978 4771 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.247984 4771 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.248063 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j7ntp" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.303050 4771 scope.go:117] "RemoveContainer" containerID="4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.316368 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j7ntp"] Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.319545 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.322500 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-j7ntp"] Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.341795 4771 scope.go:117] "RemoveContainer" containerID="4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.361979 4771 scope.go:117] "RemoveContainer" containerID="eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.426212 4771 scope.go:117] "RemoveContainer" containerID="1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.458778 4771 scope.go:117] "RemoveContainer" 
containerID="211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.473378 4771 scope.go:117] "RemoveContainer" containerID="c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.487516 4771 scope.go:117] "RemoveContainer" containerID="1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.510430 4771 scope.go:117] "RemoveContainer" containerID="4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.533401 4771 scope.go:117] "RemoveContainer" containerID="834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.547416 4771 scope.go:117] "RemoveContainer" containerID="4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.547984 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": container with ID starting with 4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3 not found: ID does not exist" containerID="4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.548027 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} err="failed to get container status \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": rpc error: code = NotFound desc = could not find container \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": container with ID starting with 
4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.548054 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.548361 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": container with ID starting with b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86 not found: ID does not exist" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.548418 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} err="failed to get container status \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": rpc error: code = NotFound desc = could not find container \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": container with ID starting with b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.548464 4771 scope.go:117] "RemoveContainer" containerID="4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.548837 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": container with ID starting with 4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32 not found: ID does not exist" containerID="4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32" Oct 01 15:06:29 crc 
kubenswrapper[4771]: I1001 15:06:29.548861 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} err="failed to get container status \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": rpc error: code = NotFound desc = could not find container \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": container with ID starting with 4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.548878 4771 scope.go:117] "RemoveContainer" containerID="eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.549246 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": container with ID starting with eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6 not found: ID does not exist" containerID="eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.549299 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} err="failed to get container status \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": rpc error: code = NotFound desc = could not find container \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": container with ID starting with eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.549312 4771 scope.go:117] "RemoveContainer" containerID="1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101" Oct 01 
15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.549537 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": container with ID starting with 1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101 not found: ID does not exist" containerID="1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.549562 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} err="failed to get container status \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": rpc error: code = NotFound desc = could not find container \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": container with ID starting with 1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.549577 4771 scope.go:117] "RemoveContainer" containerID="211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.549871 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": container with ID starting with 211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70 not found: ID does not exist" containerID="211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.549893 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} err="failed to get container status 
\"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": rpc error: code = NotFound desc = could not find container \"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": container with ID starting with 211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.549907 4771 scope.go:117] "RemoveContainer" containerID="c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.550134 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": container with ID starting with c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f not found: ID does not exist" containerID="c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.550155 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} err="failed to get container status \"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": rpc error: code = NotFound desc = could not find container \"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": container with ID starting with c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.550166 4771 scope.go:117] "RemoveContainer" containerID="1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.550476 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": container with ID starting with 1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b not found: ID does not exist" containerID="1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.550505 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} err="failed to get container status \"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": rpc error: code = NotFound desc = could not find container \"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": container with ID starting with 1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.550516 4771 scope.go:117] "RemoveContainer" containerID="4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.550854 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": container with ID starting with 4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183 not found: ID does not exist" containerID="4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.550950 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} err="failed to get container status \"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": rpc error: code = NotFound desc = could not find container \"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": container with ID 
starting with 4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.550980 4771 scope.go:117] "RemoveContainer" containerID="834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b" Oct 01 15:06:29 crc kubenswrapper[4771]: E1001 15:06:29.551244 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": container with ID starting with 834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b not found: ID does not exist" containerID="834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.551290 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} err="failed to get container status \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": rpc error: code = NotFound desc = could not find container \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": container with ID starting with 834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.551303 4771 scope.go:117] "RemoveContainer" containerID="4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.551488 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} err="failed to get container status \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": rpc error: code = NotFound desc = could not find container \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": 
container with ID starting with 4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.551536 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.551870 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} err="failed to get container status \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": rpc error: code = NotFound desc = could not find container \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": container with ID starting with b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.551891 4771 scope.go:117] "RemoveContainer" containerID="4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.552119 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} err="failed to get container status \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": rpc error: code = NotFound desc = could not find container \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": container with ID starting with 4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.552144 4771 scope.go:117] "RemoveContainer" containerID="eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.552342 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} err="failed to get container status \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": rpc error: code = NotFound desc = could not find container \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": container with ID starting with eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.552364 4771 scope.go:117] "RemoveContainer" containerID="1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.552566 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} err="failed to get container status \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": rpc error: code = NotFound desc = could not find container \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": container with ID starting with 1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.552591 4771 scope.go:117] "RemoveContainer" containerID="211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.552890 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} err="failed to get container status \"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": rpc error: code = NotFound desc = could not find container \"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": container with ID starting with 211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70 not found: ID does not 
exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.552910 4771 scope.go:117] "RemoveContainer" containerID="c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.553125 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} err="failed to get container status \"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": rpc error: code = NotFound desc = could not find container \"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": container with ID starting with c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.553146 4771 scope.go:117] "RemoveContainer" containerID="1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.553399 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} err="failed to get container status \"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": rpc error: code = NotFound desc = could not find container \"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": container with ID starting with 1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.553418 4771 scope.go:117] "RemoveContainer" containerID="4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.553618 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} err="failed to get container status 
\"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": rpc error: code = NotFound desc = could not find container \"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": container with ID starting with 4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.553638 4771 scope.go:117] "RemoveContainer" containerID="834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.553847 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} err="failed to get container status \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": rpc error: code = NotFound desc = could not find container \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": container with ID starting with 834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.553870 4771 scope.go:117] "RemoveContainer" containerID="4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554068 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} err="failed to get container status \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": rpc error: code = NotFound desc = could not find container \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": container with ID starting with 4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554089 4771 scope.go:117] "RemoveContainer" 
containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554268 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} err="failed to get container status \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": rpc error: code = NotFound desc = could not find container \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": container with ID starting with b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554294 4771 scope.go:117] "RemoveContainer" containerID="4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554485 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} err="failed to get container status \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": rpc error: code = NotFound desc = could not find container \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": container with ID starting with 4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554507 4771 scope.go:117] "RemoveContainer" containerID="eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554705 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} err="failed to get container status \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": rpc error: code = NotFound desc = could 
not find container \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": container with ID starting with eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554726 4771 scope.go:117] "RemoveContainer" containerID="1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554967 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} err="failed to get container status \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": rpc error: code = NotFound desc = could not find container \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": container with ID starting with 1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.554987 4771 scope.go:117] "RemoveContainer" containerID="211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.555249 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} err="failed to get container status \"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": rpc error: code = NotFound desc = could not find container \"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": container with ID starting with 211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.555269 4771 scope.go:117] "RemoveContainer" containerID="c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 
15:06:29.555483 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} err="failed to get container status \"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": rpc error: code = NotFound desc = could not find container \"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": container with ID starting with c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.555502 4771 scope.go:117] "RemoveContainer" containerID="1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.555777 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} err="failed to get container status \"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": rpc error: code = NotFound desc = could not find container \"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": container with ID starting with 1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.555798 4771 scope.go:117] "RemoveContainer" containerID="4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.556006 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} err="failed to get container status \"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": rpc error: code = NotFound desc = could not find container \"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": container with ID starting with 
4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.556029 4771 scope.go:117] "RemoveContainer" containerID="834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.556224 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} err="failed to get container status \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": rpc error: code = NotFound desc = could not find container \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": container with ID starting with 834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.556244 4771 scope.go:117] "RemoveContainer" containerID="4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.556581 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3"} err="failed to get container status \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": rpc error: code = NotFound desc = could not find container \"4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3\": container with ID starting with 4a303f76a4308e3dfb264f405721e20c795c4324b793808feadb388e6589dae3 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.556603 4771 scope.go:117] "RemoveContainer" containerID="b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.556817 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86"} err="failed to get container status \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": rpc error: code = NotFound desc = could not find container \"b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86\": container with ID starting with b73262e3265b2bd1a144d98c79c0de8e14fcd6af11f724dee4758e1424506b86 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.556839 4771 scope.go:117] "RemoveContainer" containerID="4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557040 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32"} err="failed to get container status \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": rpc error: code = NotFound desc = could not find container \"4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32\": container with ID starting with 4072e97e62fb7bbf4f218a06828b14170ed4dbb27d34adbe2e1f3107a7de3f32 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557061 4771 scope.go:117] "RemoveContainer" containerID="eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557266 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6"} err="failed to get container status \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": rpc error: code = NotFound desc = could not find container \"eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6\": container with ID starting with eef1e0816239b0e1354c294609b6279316de35cef6ab6a3a03b1f745d5ce9ef6 not found: ID does not 
exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557288 4771 scope.go:117] "RemoveContainer" containerID="1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557483 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101"} err="failed to get container status \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": rpc error: code = NotFound desc = could not find container \"1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101\": container with ID starting with 1d5d6c1484c062e2667f0b0398d47a99370ff5ffbed721f4006bb235a9a76101 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557507 4771 scope.go:117] "RemoveContainer" containerID="211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557706 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70"} err="failed to get container status \"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": rpc error: code = NotFound desc = could not find container \"211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70\": container with ID starting with 211c96aa09f0f45042eaecdeae56ae87948446a2359f61fbd31b66d7744aae70 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557753 4771 scope.go:117] "RemoveContainer" containerID="c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.557984 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f"} err="failed to get container status 
\"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": rpc error: code = NotFound desc = could not find container \"c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f\": container with ID starting with c3f13ead91f7acaa2de694744ffe9dd3b49080be1c7fc1cb3f15bdd32a8fc14f not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.558005 4771 scope.go:117] "RemoveContainer" containerID="1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.558234 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b"} err="failed to get container status \"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": rpc error: code = NotFound desc = could not find container \"1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b\": container with ID starting with 1e4fb64eda62b0940a3bef63ef85bb410e1ba5d1ecf88370a0ea8608c4b5a63b not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.558312 4771 scope.go:117] "RemoveContainer" containerID="4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.558805 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183"} err="failed to get container status \"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": rpc error: code = NotFound desc = could not find container \"4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183\": container with ID starting with 4e6af6accbf59bb6f6830eb1d87ec63ef6b8feaa8924d157d2095b5d5c902183 not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.558822 4771 scope.go:117] "RemoveContainer" 
containerID="834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.559168 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b"} err="failed to get container status \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": rpc error: code = NotFound desc = could not find container \"834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b\": container with ID starting with 834ebac2eb7bf06b911f2750e74c4d6b2e2a4460807b32f75a6ec2662f9b3a2b not found: ID does not exist" Oct 01 15:06:29 crc kubenswrapper[4771]: I1001 15:06:29.997635 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a061b8e2-74a8-4953-bfa2-5090a2f70459" path="/var/lib/kubelet/pods/a061b8e2-74a8-4953-bfa2-5090a2f70459/volumes" Oct 01 15:06:30 crc kubenswrapper[4771]: I1001 15:06:30.255234 4771 generic.go:334] "Generic (PLEG): container finished" podID="86a1c36c-3910-41d0-bb5b-b2603e3c78dc" containerID="3b1efc2299c70b81a143aa85ceff8ce7fa67b1c86ce71c3d5c47976aa3bb595e" exitCode=0 Oct 01 15:06:30 crc kubenswrapper[4771]: I1001 15:06:30.255311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerDied","Data":"3b1efc2299c70b81a143aa85ceff8ce7fa67b1c86ce71c3d5c47976aa3bb595e"} Oct 01 15:06:30 crc kubenswrapper[4771]: I1001 15:06:30.257935 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/2.log" Oct 01 15:06:31 crc kubenswrapper[4771]: I1001 15:06:31.269237 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" 
event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"17559f0d195f57453ca907d5cfeb4e529e3225dbaad4ab279d1bc9f0a8cf6122"} Oct 01 15:06:31 crc kubenswrapper[4771]: I1001 15:06:31.269626 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"8c8e6a0d9fa2a06567a65bb1be54bd939316c140ec68a7d3cd8645425de63730"} Oct 01 15:06:31 crc kubenswrapper[4771]: I1001 15:06:31.269642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"026854e5b0f4536e2fd9e1686f13a7adc2db9ac184f83c1bd25971b658bb4c55"} Oct 01 15:06:31 crc kubenswrapper[4771]: I1001 15:06:31.269652 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"38cbeeb1a2a5f40142d55ed749b91edc89de622644a6834182dd46e92a703448"} Oct 01 15:06:31 crc kubenswrapper[4771]: I1001 15:06:31.269661 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"23136ddbb96f6224e35f3afe839a765c4e939e459743972895702901e07dadec"} Oct 01 15:06:32 crc kubenswrapper[4771]: I1001 15:06:32.280240 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"fcae7f144c264355770dc2066cc5eca1dd36304900dda113dbc7f760011e5ab0"} Oct 01 15:06:34 crc kubenswrapper[4771]: I1001 15:06:34.302306 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" 
event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"687a97d6524737f2359c9531b5dd65edc9bdc851b93ef656eff7c9de41d77711"} Oct 01 15:06:36 crc kubenswrapper[4771]: I1001 15:06:36.316006 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" event={"ID":"86a1c36c-3910-41d0-bb5b-b2603e3c78dc","Type":"ContainerStarted","Data":"298a2cb2c4838b7c6ffb8658210bcb7651a1dd43e3d2de0863e90f4f574a151b"} Oct 01 15:06:36 crc kubenswrapper[4771]: I1001 15:06:36.316438 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:36 crc kubenswrapper[4771]: I1001 15:06:36.316463 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:36 crc kubenswrapper[4771]: I1001 15:06:36.316480 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:36 crc kubenswrapper[4771]: I1001 15:06:36.347461 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:36 crc kubenswrapper[4771]: I1001 15:06:36.350133 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" Oct 01 15:06:36 crc kubenswrapper[4771]: I1001 15:06:36.369632 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj" podStartSLOduration=8.36961122 podStartE2EDuration="8.36961122s" podCreationTimestamp="2025-10-01 15:06:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:06:36.360078164 +0000 UTC m=+640.979253345" watchObservedRunningTime="2025-10-01 15:06:36.36961122 +0000 UTC m=+640.988786411" Oct 01 15:06:44 crc 
kubenswrapper[4771]: I1001 15:06:44.985531 4771 scope.go:117] "RemoveContainer" containerID="2d79a918c545b25fd3949c7e72a6a5446d3d66da50fda7363410f26ddf35b04e"
Oct 01 15:06:44 crc kubenswrapper[4771]: E1001 15:06:44.986846 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-9lvcz_openshift-multus(c96a3328-c79b-4528-b9b5-badbc7380dd6)\"" pod="openshift-multus/multus-9lvcz" podUID="c96a3328-c79b-4528-b9b5-badbc7380dd6"
Oct 01 15:06:55 crc kubenswrapper[4771]: I1001 15:06:55.990147 4771 scope.go:117] "RemoveContainer" containerID="2d79a918c545b25fd3949c7e72a6a5446d3d66da50fda7363410f26ddf35b04e"
Oct 01 15:06:56 crc kubenswrapper[4771]: I1001 15:06:56.452866 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-9lvcz_c96a3328-c79b-4528-b9b5-badbc7380dd6/kube-multus/2.log"
Oct 01 15:06:56 crc kubenswrapper[4771]: I1001 15:06:56.453620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-9lvcz" event={"ID":"c96a3328-c79b-4528-b9b5-badbc7380dd6","Type":"ContainerStarted","Data":"4c9a15f3e85f2ad56941eb243c545a6a11622914490642b7d99945cbc43fb0c1"}
Oct 01 15:06:59 crc kubenswrapper[4771]: I1001 15:06:59.214531 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j8plj"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.302669 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"]
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.305417 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.307406 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.307531 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"]
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.346875 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.347001 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5krdd\" (UniqueName: \"kubernetes.io/projected/f9a434b3-f6c6-441a-bc5f-0731967288da-kube-api-access-5krdd\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.347145 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.448243 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5krdd\" (UniqueName: \"kubernetes.io/projected/f9a434b3-f6c6-441a-bc5f-0731967288da-kube-api-access-5krdd\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.448409 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.448519 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.449311 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.449442 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.482215 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5krdd\" (UniqueName: \"kubernetes.io/projected/f9a434b3-f6c6-441a-bc5f-0731967288da-kube-api-access-5krdd\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.628160 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:07 crc kubenswrapper[4771]: I1001 15:07:07.873922 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"]
Oct 01 15:07:08 crc kubenswrapper[4771]: I1001 15:07:08.538573 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerID="033ca5f82ba9569c94cd8766df66a80e0f16cbc9c4aebfbbea419d46876aab3d" exitCode=0
Oct 01 15:07:08 crc kubenswrapper[4771]: I1001 15:07:08.538647 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p" event={"ID":"f9a434b3-f6c6-441a-bc5f-0731967288da","Type":"ContainerDied","Data":"033ca5f82ba9569c94cd8766df66a80e0f16cbc9c4aebfbbea419d46876aab3d"}
Oct 01 15:07:08 crc kubenswrapper[4771]: I1001 15:07:08.539025 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p" event={"ID":"f9a434b3-f6c6-441a-bc5f-0731967288da","Type":"ContainerStarted","Data":"79681bce31b24f3c90f0180f6e3761d788180c72bdb0df8c182e180b162778ff"}
Oct 01 15:07:10 crc kubenswrapper[4771]: I1001 15:07:10.559361 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerID="dae00f3c629cd9f37586f57664a6eff9059be0f93709370f43feed1897a67f1e" exitCode=0
Oct 01 15:07:10 crc kubenswrapper[4771]: I1001 15:07:10.559467 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p" event={"ID":"f9a434b3-f6c6-441a-bc5f-0731967288da","Type":"ContainerDied","Data":"dae00f3c629cd9f37586f57664a6eff9059be0f93709370f43feed1897a67f1e"}
Oct 01 15:07:11 crc kubenswrapper[4771]: I1001 15:07:11.575494 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerID="f5a455d88144af3823115472a78c3cd00e99aa9fee330bcd5db8d9f19fcbea24" exitCode=0
Oct 01 15:07:11 crc kubenswrapper[4771]: I1001 15:07:11.575561 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p" event={"ID":"f9a434b3-f6c6-441a-bc5f-0731967288da","Type":"ContainerDied","Data":"f5a455d88144af3823115472a78c3cd00e99aa9fee330bcd5db8d9f19fcbea24"}
Oct 01 15:07:12 crc kubenswrapper[4771]: I1001 15:07:12.893505 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.068824 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-util\") pod \"f9a434b3-f6c6-441a-bc5f-0731967288da\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") "
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.068896 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5krdd\" (UniqueName: \"kubernetes.io/projected/f9a434b3-f6c6-441a-bc5f-0731967288da-kube-api-access-5krdd\") pod \"f9a434b3-f6c6-441a-bc5f-0731967288da\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") "
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.068933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-bundle\") pod \"f9a434b3-f6c6-441a-bc5f-0731967288da\" (UID: \"f9a434b3-f6c6-441a-bc5f-0731967288da\") "
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.070271 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-bundle" (OuterVolumeSpecName: "bundle") pod "f9a434b3-f6c6-441a-bc5f-0731967288da" (UID: "f9a434b3-f6c6-441a-bc5f-0731967288da"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.077692 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a434b3-f6c6-441a-bc5f-0731967288da-kube-api-access-5krdd" (OuterVolumeSpecName: "kube-api-access-5krdd") pod "f9a434b3-f6c6-441a-bc5f-0731967288da" (UID: "f9a434b3-f6c6-441a-bc5f-0731967288da"). InnerVolumeSpecName "kube-api-access-5krdd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.101372 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-util" (OuterVolumeSpecName: "util") pod "f9a434b3-f6c6-441a-bc5f-0731967288da" (UID: "f9a434b3-f6c6-441a-bc5f-0731967288da"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.171186 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-util\") on node \"crc\" DevicePath \"\""
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.171255 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5krdd\" (UniqueName: \"kubernetes.io/projected/f9a434b3-f6c6-441a-bc5f-0731967288da-kube-api-access-5krdd\") on node \"crc\" DevicePath \"\""
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.171284 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f9a434b3-f6c6-441a-bc5f-0731967288da-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.592629 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p" event={"ID":"f9a434b3-f6c6-441a-bc5f-0731967288da","Type":"ContainerDied","Data":"79681bce31b24f3c90f0180f6e3761d788180c72bdb0df8c182e180b162778ff"}
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.592704 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79681bce31b24f3c90f0180f6e3761d788180c72bdb0df8c182e180b162778ff"
Oct 01 15:07:13 crc kubenswrapper[4771]: I1001 15:07:13.592763 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.117661 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv"]
Oct 01 15:07:15 crc kubenswrapper[4771]: E1001 15:07:15.117870 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerName="extract"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.117881 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerName="extract"
Oct 01 15:07:15 crc kubenswrapper[4771]: E1001 15:07:15.117892 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerName="pull"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.117898 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerName="pull"
Oct 01 15:07:15 crc kubenswrapper[4771]: E1001 15:07:15.117909 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerName="util"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.117915 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerName="util"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.118001 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a434b3-f6c6-441a-bc5f-0731967288da" containerName="extract"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.118341 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.120692 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.120950 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-gfg22"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.121080 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.135876 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv"]
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.300119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pls\" (UniqueName: \"kubernetes.io/projected/88f55c4c-a1d9-4751-9776-562464717201-kube-api-access-g6pls\") pod \"nmstate-operator-5d6f6cfd66-9twbv\" (UID: \"88f55c4c-a1d9-4751-9776-562464717201\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.401297 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pls\" (UniqueName: \"kubernetes.io/projected/88f55c4c-a1d9-4751-9776-562464717201-kube-api-access-g6pls\") pod \"nmstate-operator-5d6f6cfd66-9twbv\" (UID: \"88f55c4c-a1d9-4751-9776-562464717201\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.434213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pls\" (UniqueName: \"kubernetes.io/projected/88f55c4c-a1d9-4751-9776-562464717201-kube-api-access-g6pls\") pod \"nmstate-operator-5d6f6cfd66-9twbv\" (UID: \"88f55c4c-a1d9-4751-9776-562464717201\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv"
Oct 01 15:07:15 crc kubenswrapper[4771]: I1001 15:07:15.732467 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv"
Oct 01 15:07:16 crc kubenswrapper[4771]: I1001 15:07:16.068245 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv"]
Oct 01 15:07:16 crc kubenswrapper[4771]: I1001 15:07:16.613702 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv" event={"ID":"88f55c4c-a1d9-4751-9776-562464717201","Type":"ContainerStarted","Data":"e8f18ce10c8e8a9ba3cd44ddc71de9bdc3b56c4800dc20bdf5bab7f8b2d1d95e"}
Oct 01 15:07:19 crc kubenswrapper[4771]: I1001 15:07:19.633332 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv" event={"ID":"88f55c4c-a1d9-4751-9776-562464717201","Type":"ContainerStarted","Data":"0a6cbcbcf21ba6488c1447274b87ad6f79ced3826f80b85ab99cae5fdadbef02"}
Oct 01 15:07:19 crc kubenswrapper[4771]: I1001 15:07:19.668338 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-9twbv" podStartSLOduration=2.160084981 podStartE2EDuration="4.668309305s" podCreationTimestamp="2025-10-01 15:07:15 +0000 UTC" firstStartedPulling="2025-10-01 15:07:16.080616806 +0000 UTC m=+680.699791977" lastFinishedPulling="2025-10-01 15:07:18.58884112 +0000 UTC m=+683.208016301" observedRunningTime="2025-10-01 15:07:19.667510806 +0000 UTC m=+684.286686077" watchObservedRunningTime="2025-10-01 15:07:19.668309305 +0000 UTC m=+684.287484526"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.598611 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-wpddr"]
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.599588 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.602849 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-8hnp5"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.615683 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"]
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.617482 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.619693 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.623481 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-wpddr"]
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.630835 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-vqfvd"]
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.631625 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.637333 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"]
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.677877 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqwvc\" (UniqueName: \"kubernetes.io/projected/e9d87ba9-d0d9-4647-a69b-4a114140b6be-kube-api-access-mqwvc\") pod \"nmstate-webhook-6d689559c5-rx6d2\" (UID: \"e9d87ba9-d0d9-4647-a69b-4a114140b6be\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.677937 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e9d87ba9-d0d9-4647-a69b-4a114140b6be-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-rx6d2\" (UID: \"e9d87ba9-d0d9-4647-a69b-4a114140b6be\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.678018 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-ovs-socket\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.678051 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-dbus-socket\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.678101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pl72\" (UniqueName: \"kubernetes.io/projected/5082e8a7-dbba-4c99-9c6a-f35f64310963-kube-api-access-6pl72\") pod \"nmstate-metrics-58fcddf996-wpddr\" (UID: \"5082e8a7-dbba-4c99-9c6a-f35f64310963\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.678130 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-nmstate-lock\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.678176 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdzcj\" (UniqueName: \"kubernetes.io/projected/a8a82139-4b56-419e-a4e4-143e3246ec96-kube-api-access-rdzcj\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.765648 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"]
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.766868 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.770303 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"]
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.770323 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.770350 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.770428 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-rttt9"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781015 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-dbus-socket\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781052 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4shr8\" (UniqueName: \"kubernetes.io/projected/2d196fd3-9a93-4f81-b0ad-fefca77240a5-kube-api-access-4shr8\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781076 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pl72\" (UniqueName: \"kubernetes.io/projected/5082e8a7-dbba-4c99-9c6a-f35f64310963-kube-api-access-6pl72\") pod \"nmstate-metrics-58fcddf996-wpddr\" (UID: \"5082e8a7-dbba-4c99-9c6a-f35f64310963\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781092 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-nmstate-lock\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781109 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2d196fd3-9a93-4f81-b0ad-fefca77240a5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781131 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdzcj\" (UniqueName: \"kubernetes.io/projected/a8a82139-4b56-419e-a4e4-143e3246ec96-kube-api-access-rdzcj\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781169 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqwvc\" (UniqueName: \"kubernetes.io/projected/e9d87ba9-d0d9-4647-a69b-4a114140b6be-kube-api-access-mqwvc\") pod \"nmstate-webhook-6d689559c5-rx6d2\" (UID: \"e9d87ba9-d0d9-4647-a69b-4a114140b6be\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e9d87ba9-d0d9-4647-a69b-4a114140b6be-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-rx6d2\" (UID: \"e9d87ba9-d0d9-4647-a69b-4a114140b6be\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781208 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d196fd3-9a93-4f81-b0ad-fefca77240a5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781233 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-ovs-socket\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781283 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-ovs-socket\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781462 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-dbus-socket\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.781520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a8a82139-4b56-419e-a4e4-143e3246ec96-nmstate-lock\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: E1001 15:07:20.782053 4771 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found
Oct 01 15:07:20 crc kubenswrapper[4771]: E1001 15:07:20.782102 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9d87ba9-d0d9-4647-a69b-4a114140b6be-tls-key-pair podName:e9d87ba9-d0d9-4647-a69b-4a114140b6be nodeName:}" failed. No retries permitted until 2025-10-01 15:07:21.28208605 +0000 UTC m=+685.901261221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/e9d87ba9-d0d9-4647-a69b-4a114140b6be-tls-key-pair") pod "nmstate-webhook-6d689559c5-rx6d2" (UID: "e9d87ba9-d0d9-4647-a69b-4a114140b6be") : secret "openshift-nmstate-webhook" not found
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.800159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pl72\" (UniqueName: \"kubernetes.io/projected/5082e8a7-dbba-4c99-9c6a-f35f64310963-kube-api-access-6pl72\") pod \"nmstate-metrics-58fcddf996-wpddr\" (UID: \"5082e8a7-dbba-4c99-9c6a-f35f64310963\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.800300 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdzcj\" (UniqueName: \"kubernetes.io/projected/a8a82139-4b56-419e-a4e4-143e3246ec96-kube-api-access-rdzcj\") pod \"nmstate-handler-vqfvd\" (UID: \"a8a82139-4b56-419e-a4e4-143e3246ec96\") " pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.800542 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqwvc\" (UniqueName: \"kubernetes.io/projected/e9d87ba9-d0d9-4647-a69b-4a114140b6be-kube-api-access-mqwvc\") pod \"nmstate-webhook-6d689559c5-rx6d2\" (UID: \"e9d87ba9-d0d9-4647-a69b-4a114140b6be\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.882273 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4shr8\" (UniqueName: \"kubernetes.io/projected/2d196fd3-9a93-4f81-b0ad-fefca77240a5-kube-api-access-4shr8\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.882342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2d196fd3-9a93-4f81-b0ad-fefca77240a5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.882429 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d196fd3-9a93-4f81-b0ad-fefca77240a5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.883642 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2d196fd3-9a93-4f81-b0ad-fefca77240a5-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.888351 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2d196fd3-9a93-4f81-b0ad-fefca77240a5-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.921302 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.933653 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4shr8\" (UniqueName: \"kubernetes.io/projected/2d196fd3-9a93-4f81-b0ad-fefca77240a5-kube-api-access-4shr8\") pod \"nmstate-console-plugin-864bb6dfb5-nghbt\" (UID: \"2d196fd3-9a93-4f81-b0ad-fefca77240a5\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:20 crc kubenswrapper[4771]: I1001 15:07:20.960149 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-vqfvd"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.019216 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7fbcd55fbb-lkc96"]
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.019910 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.036304 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fbcd55fbb-lkc96"]
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.087576 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-trusted-ca-bundle\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.087623 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-oauth-config\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.087662 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2c68\" (UniqueName: \"kubernetes.io/projected/6d57ee87-74fd-4f48-a959-85a5aafa1a68-kube-api-access-n2c68\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.087690 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-oauth-serving-cert\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.087770 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-config\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.087797 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-service-ca\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.087844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-serving-cert\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.095330 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.189379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-oauth-serving-cert\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.189706 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-config\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.189747 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-service-ca\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.189804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-serving-cert\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96"
Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.189848 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-trusted-ca-bundle\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") "
pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.189866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-oauth-config\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.189891 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2c68\" (UniqueName: \"kubernetes.io/projected/6d57ee87-74fd-4f48-a959-85a5aafa1a68-kube-api-access-n2c68\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.190170 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-oauth-serving-cert\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.191756 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-config\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.192340 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-trusted-ca-bundle\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " 
pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.192403 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6d57ee87-74fd-4f48-a959-85a5aafa1a68-service-ca\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.194884 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-serving-cert\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.194942 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6d57ee87-74fd-4f48-a959-85a5aafa1a68-console-oauth-config\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.207877 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2c68\" (UniqueName: \"kubernetes.io/projected/6d57ee87-74fd-4f48-a959-85a5aafa1a68-kube-api-access-n2c68\") pod \"console-7fbcd55fbb-lkc96\" (UID: \"6d57ee87-74fd-4f48-a959-85a5aafa1a68\") " pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.291038 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e9d87ba9-d0d9-4647-a69b-4a114140b6be-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-rx6d2\" (UID: \"e9d87ba9-d0d9-4647-a69b-4a114140b6be\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2" Oct 
01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.296261 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e9d87ba9-d0d9-4647-a69b-4a114140b6be-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-rx6d2\" (UID: \"e9d87ba9-d0d9-4647-a69b-4a114140b6be\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.367148 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.413854 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-wpddr"] Oct 01 15:07:21 crc kubenswrapper[4771]: W1001 15:07:21.427676 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5082e8a7_dbba_4c99_9c6a_f35f64310963.slice/crio-6aead23b9025999144eac6bfc5caeca7796179a1469efd5f2e4ac74a7f2fe071 WatchSource:0}: Error finding container 6aead23b9025999144eac6bfc5caeca7796179a1469efd5f2e4ac74a7f2fe071: Status 404 returned error can't find the container with id 6aead23b9025999144eac6bfc5caeca7796179a1469efd5f2e4ac74a7f2fe071 Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.488549 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt"] Oct 01 15:07:21 crc kubenswrapper[4771]: W1001 15:07:21.491205 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d196fd3_9a93_4f81_b0ad_fefca77240a5.slice/crio-305443b320552914e5340efea1a5b40c68165916145af7c8302a497562dbe1de WatchSource:0}: Error finding container 305443b320552914e5340efea1a5b40c68165916145af7c8302a497562dbe1de: Status 404 returned error can't find the container with id 
305443b320552914e5340efea1a5b40c68165916145af7c8302a497562dbe1de Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.549118 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2" Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.568649 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7fbcd55fbb-lkc96"] Oct 01 15:07:21 crc kubenswrapper[4771]: W1001 15:07:21.572873 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d57ee87_74fd_4f48_a959_85a5aafa1a68.slice/crio-32aaf4e0eaaf5f23b29274c0f056e3c7c2069e6236a6be341cd2fd2c5fe6519a WatchSource:0}: Error finding container 32aaf4e0eaaf5f23b29274c0f056e3c7c2069e6236a6be341cd2fd2c5fe6519a: Status 404 returned error can't find the container with id 32aaf4e0eaaf5f23b29274c0f056e3c7c2069e6236a6be341cd2fd2c5fe6519a Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.655282 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fbcd55fbb-lkc96" event={"ID":"6d57ee87-74fd-4f48-a959-85a5aafa1a68","Type":"ContainerStarted","Data":"32aaf4e0eaaf5f23b29274c0f056e3c7c2069e6236a6be341cd2fd2c5fe6519a"} Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.657236 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt" event={"ID":"2d196fd3-9a93-4f81-b0ad-fefca77240a5","Type":"ContainerStarted","Data":"305443b320552914e5340efea1a5b40c68165916145af7c8302a497562dbe1de"} Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.658636 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vqfvd" event={"ID":"a8a82139-4b56-419e-a4e4-143e3246ec96","Type":"ContainerStarted","Data":"e548eaf5af4c8d778d4e64f5accc524bc4334a803f02fc03af0dae220753b58e"} Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.660060 
4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr" event={"ID":"5082e8a7-dbba-4c99-9c6a-f35f64310963","Type":"ContainerStarted","Data":"6aead23b9025999144eac6bfc5caeca7796179a1469efd5f2e4ac74a7f2fe071"} Oct 01 15:07:21 crc kubenswrapper[4771]: I1001 15:07:21.979989 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2"] Oct 01 15:07:21 crc kubenswrapper[4771]: W1001 15:07:21.984526 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9d87ba9_d0d9_4647_a69b_4a114140b6be.slice/crio-41ad0235129da8f0dbd8bd54e4d1e394411945b263c221896097af81bddb7cc6 WatchSource:0}: Error finding container 41ad0235129da8f0dbd8bd54e4d1e394411945b263c221896097af81bddb7cc6: Status 404 returned error can't find the container with id 41ad0235129da8f0dbd8bd54e4d1e394411945b263c221896097af81bddb7cc6 Oct 01 15:07:22 crc kubenswrapper[4771]: I1001 15:07:22.669673 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2" event={"ID":"e9d87ba9-d0d9-4647-a69b-4a114140b6be","Type":"ContainerStarted","Data":"41ad0235129da8f0dbd8bd54e4d1e394411945b263c221896097af81bddb7cc6"} Oct 01 15:07:22 crc kubenswrapper[4771]: I1001 15:07:22.673840 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7fbcd55fbb-lkc96" event={"ID":"6d57ee87-74fd-4f48-a959-85a5aafa1a68","Type":"ContainerStarted","Data":"d009bf0ed4f21935e75ad0a663f5634ae21861d35d4f6ed8d4ca29b3cec0a36a"} Oct 01 15:07:22 crc kubenswrapper[4771]: I1001 15:07:22.694528 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7fbcd55fbb-lkc96" podStartSLOduration=1.6945092179999999 podStartE2EDuration="1.694509218s" podCreationTimestamp="2025-10-01 15:07:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:07:22.690357757 +0000 UTC m=+687.309532948" watchObservedRunningTime="2025-10-01 15:07:22.694509218 +0000 UTC m=+687.313684389" Oct 01 15:07:24 crc kubenswrapper[4771]: I1001 15:07:24.693376 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt" event={"ID":"2d196fd3-9a93-4f81-b0ad-fefca77240a5","Type":"ContainerStarted","Data":"56edacee3850f3378b85c3fd7dbfe0aee92fb370a0d8e4abeb770849f08ef8c6"} Oct 01 15:07:24 crc kubenswrapper[4771]: I1001 15:07:24.696535 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-vqfvd" event={"ID":"a8a82139-4b56-419e-a4e4-143e3246ec96","Type":"ContainerStarted","Data":"bbfafb5716173648250ba05ffa7af9da04cac4abdad73874ef606b8e591d0832"} Oct 01 15:07:24 crc kubenswrapper[4771]: I1001 15:07:24.696653 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-vqfvd" Oct 01 15:07:24 crc kubenswrapper[4771]: I1001 15:07:24.698480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2" event={"ID":"e9d87ba9-d0d9-4647-a69b-4a114140b6be","Type":"ContainerStarted","Data":"aabbeb9084db33637dc0e4a5a32dde4fc8f6c5846e794050ddef3329b5a9dbd2"} Oct 01 15:07:24 crc kubenswrapper[4771]: I1001 15:07:24.698611 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2" Oct 01 15:07:24 crc kubenswrapper[4771]: I1001 15:07:24.699815 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr" event={"ID":"5082e8a7-dbba-4c99-9c6a-f35f64310963","Type":"ContainerStarted","Data":"42909f1f45a828c1da015c059b8fb5abfca4c9678da7443726ed814ff9f0d7ba"} Oct 01 15:07:24 crc kubenswrapper[4771]: I1001 15:07:24.715534 4771 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-nghbt" podStartSLOduration=1.688381228 podStartE2EDuration="4.715515804s" podCreationTimestamp="2025-10-01 15:07:20 +0000 UTC" firstStartedPulling="2025-10-01 15:07:21.493096515 +0000 UTC m=+686.112271696" lastFinishedPulling="2025-10-01 15:07:24.520231101 +0000 UTC m=+689.139406272" observedRunningTime="2025-10-01 15:07:24.711523865 +0000 UTC m=+689.330699046" watchObservedRunningTime="2025-10-01 15:07:24.715515804 +0000 UTC m=+689.334690985" Oct 01 15:07:24 crc kubenswrapper[4771]: I1001 15:07:24.731437 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-vqfvd" podStartSLOduration=2.038157905 podStartE2EDuration="4.731395697s" podCreationTimestamp="2025-10-01 15:07:20 +0000 UTC" firstStartedPulling="2025-10-01 15:07:20.980951311 +0000 UTC m=+685.600126482" lastFinishedPulling="2025-10-01 15:07:23.674189063 +0000 UTC m=+688.293364274" observedRunningTime="2025-10-01 15:07:24.724105267 +0000 UTC m=+689.343280448" watchObservedRunningTime="2025-10-01 15:07:24.731395697 +0000 UTC m=+689.350570908" Oct 01 15:07:26 crc kubenswrapper[4771]: I1001 15:07:26.003816 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2" podStartSLOduration=4.316225933 podStartE2EDuration="6.003799287s" podCreationTimestamp="2025-10-01 15:07:20 +0000 UTC" firstStartedPulling="2025-10-01 15:07:21.988121807 +0000 UTC m=+686.607296988" lastFinishedPulling="2025-10-01 15:07:23.675695131 +0000 UTC m=+688.294870342" observedRunningTime="2025-10-01 15:07:24.746440179 +0000 UTC m=+689.365615430" watchObservedRunningTime="2025-10-01 15:07:26.003799287 +0000 UTC m=+690.622974448" Oct 01 15:07:27 crc kubenswrapper[4771]: I1001 15:07:27.722963 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr" 
event={"ID":"5082e8a7-dbba-4c99-9c6a-f35f64310963","Type":"ContainerStarted","Data":"5fad14f50a6f75a688caa57afd3d43264292e5b8421b04cdb92647cf2a08c741"} Oct 01 15:07:30 crc kubenswrapper[4771]: I1001 15:07:30.993370 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-vqfvd" Oct 01 15:07:31 crc kubenswrapper[4771]: I1001 15:07:31.015021 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-wpddr" podStartSLOduration=5.811737893 podStartE2EDuration="11.014994833s" podCreationTimestamp="2025-10-01 15:07:20 +0000 UTC" firstStartedPulling="2025-10-01 15:07:21.430925597 +0000 UTC m=+686.050100768" lastFinishedPulling="2025-10-01 15:07:26.634182537 +0000 UTC m=+691.253357708" observedRunningTime="2025-10-01 15:07:27.750267018 +0000 UTC m=+692.369442269" watchObservedRunningTime="2025-10-01 15:07:31.014994833 +0000 UTC m=+695.634170034" Oct 01 15:07:31 crc kubenswrapper[4771]: I1001 15:07:31.367647 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:31 crc kubenswrapper[4771]: I1001 15:07:31.367925 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:31 crc kubenswrapper[4771]: I1001 15:07:31.377029 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:31 crc kubenswrapper[4771]: I1001 15:07:31.756035 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7fbcd55fbb-lkc96" Oct 01 15:07:31 crc kubenswrapper[4771]: I1001 15:07:31.841316 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-szwtc"] Oct 01 15:07:41 crc kubenswrapper[4771]: I1001 15:07:41.558922 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-rx6d2" Oct 01 15:07:55 crc kubenswrapper[4771]: I1001 15:07:55.956157 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv"] Oct 01 15:07:55 crc kubenswrapper[4771]: I1001 15:07:55.957943 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:55 crc kubenswrapper[4771]: I1001 15:07:55.960867 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 01 15:07:55 crc kubenswrapper[4771]: I1001 15:07:55.969675 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv"] Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.016392 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pdnd\" (UniqueName: \"kubernetes.io/projected/5c4805aa-64a3-4354-b3ab-ab48935503cf-kube-api-access-5pdnd\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.016645 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.016666 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.117448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pdnd\" (UniqueName: \"kubernetes.io/projected/5c4805aa-64a3-4354-b3ab-ab48935503cf-kube-api-access-5pdnd\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.117550 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.117576 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.118045 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-util\") pod 
\"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.118294 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.145853 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pdnd\" (UniqueName: \"kubernetes.io/projected/5c4805aa-64a3-4354-b3ab-ab48935503cf-kube-api-access-5pdnd\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.273530 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.514185 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv"] Oct 01 15:07:56 crc kubenswrapper[4771]: W1001 15:07:56.526963 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c4805aa_64a3_4354_b3ab_ab48935503cf.slice/crio-976cf674d2852e34b80695c9b61d29b7aed6a9eb9e5208f4c1c9c839aa603f74 WatchSource:0}: Error finding container 976cf674d2852e34b80695c9b61d29b7aed6a9eb9e5208f4c1c9c839aa603f74: Status 404 returned error can't find the container with id 976cf674d2852e34b80695c9b61d29b7aed6a9eb9e5208f4c1c9c839aa603f74 Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.905365 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-szwtc" podUID="9db1abd4-f11c-45e1-9341-6c818c3e3579" containerName="console" containerID="cri-o://92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5" gracePeriod=15 Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.920393 4771 generic.go:334] "Generic (PLEG): container finished" podID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerID="9ce2a34edc7bb732f08a43ac09656907cb04f2f81da142f64d544ec895a58238" exitCode=0 Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.920479 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" event={"ID":"5c4805aa-64a3-4354-b3ab-ab48935503cf","Type":"ContainerDied","Data":"9ce2a34edc7bb732f08a43ac09656907cb04f2f81da142f64d544ec895a58238"} Oct 01 15:07:56 crc kubenswrapper[4771]: I1001 15:07:56.920527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" event={"ID":"5c4805aa-64a3-4354-b3ab-ab48935503cf","Type":"ContainerStarted","Data":"976cf674d2852e34b80695c9b61d29b7aed6a9eb9e5208f4c1c9c839aa603f74"} Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.348470 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-szwtc_9db1abd4-f11c-45e1-9341-6c818c3e3579/console/0.log" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.348570 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-szwtc" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.535303 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h95c4\" (UniqueName: \"kubernetes.io/projected/9db1abd4-f11c-45e1-9341-6c818c3e3579-kube-api-access-h95c4\") pod \"9db1abd4-f11c-45e1-9341-6c818c3e3579\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.535383 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-trusted-ca-bundle\") pod \"9db1abd4-f11c-45e1-9341-6c818c3e3579\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.535438 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-config\") pod \"9db1abd4-f11c-45e1-9341-6c818c3e3579\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.535472 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-serving-cert\") pod \"9db1abd4-f11c-45e1-9341-6c818c3e3579\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.535524 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-oauth-config\") pod \"9db1abd4-f11c-45e1-9341-6c818c3e3579\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.535561 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-service-ca\") pod \"9db1abd4-f11c-45e1-9341-6c818c3e3579\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.535596 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-oauth-serving-cert\") pod \"9db1abd4-f11c-45e1-9341-6c818c3e3579\" (UID: \"9db1abd4-f11c-45e1-9341-6c818c3e3579\") " Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.536807 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9db1abd4-f11c-45e1-9341-6c818c3e3579" (UID: "9db1abd4-f11c-45e1-9341-6c818c3e3579"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.536842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-config" (OuterVolumeSpecName: "console-config") pod "9db1abd4-f11c-45e1-9341-6c818c3e3579" (UID: "9db1abd4-f11c-45e1-9341-6c818c3e3579"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.536853 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-service-ca" (OuterVolumeSpecName: "service-ca") pod "9db1abd4-f11c-45e1-9341-6c818c3e3579" (UID: "9db1abd4-f11c-45e1-9341-6c818c3e3579"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.536875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9db1abd4-f11c-45e1-9341-6c818c3e3579" (UID: "9db1abd4-f11c-45e1-9341-6c818c3e3579"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.543663 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9db1abd4-f11c-45e1-9341-6c818c3e3579" (UID: "9db1abd4-f11c-45e1-9341-6c818c3e3579"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.543600 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9db1abd4-f11c-45e1-9341-6c818c3e3579-kube-api-access-h95c4" (OuterVolumeSpecName: "kube-api-access-h95c4") pod "9db1abd4-f11c-45e1-9341-6c818c3e3579" (UID: "9db1abd4-f11c-45e1-9341-6c818c3e3579"). InnerVolumeSpecName "kube-api-access-h95c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.544019 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9db1abd4-f11c-45e1-9341-6c818c3e3579" (UID: "9db1abd4-f11c-45e1-9341-6c818c3e3579"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.637888 4771 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.638103 4771 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.638124 4771 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.638173 4771 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/9db1abd4-f11c-45e1-9341-6c818c3e3579-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.638190 4771 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-service-ca\") on node \"crc\" DevicePath \"\"" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.638206 4771 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9db1abd4-f11c-45e1-9341-6c818c3e3579-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.638223 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h95c4\" (UniqueName: \"kubernetes.io/projected/9db1abd4-f11c-45e1-9341-6c818c3e3579-kube-api-access-h95c4\") on node \"crc\" DevicePath \"\"" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.935926 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-szwtc_9db1abd4-f11c-45e1-9341-6c818c3e3579/console/0.log" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.936012 4771 generic.go:334] "Generic (PLEG): container finished" podID="9db1abd4-f11c-45e1-9341-6c818c3e3579" containerID="92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5" exitCode=2 Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.936055 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-szwtc" event={"ID":"9db1abd4-f11c-45e1-9341-6c818c3e3579","Type":"ContainerDied","Data":"92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5"} Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.936097 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-szwtc" 
event={"ID":"9db1abd4-f11c-45e1-9341-6c818c3e3579","Type":"ContainerDied","Data":"6744bdf07ecad79d6adfb13deec5c69bf3eddb2b16dca4d7fcbe2ede99c007ee"} Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.936125 4771 scope.go:117] "RemoveContainer" containerID="92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.936147 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-szwtc" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.964889 4771 scope.go:117] "RemoveContainer" containerID="92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5" Oct 01 15:07:57 crc kubenswrapper[4771]: E1001 15:07:57.965461 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5\": container with ID starting with 92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5 not found: ID does not exist" containerID="92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.965526 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5"} err="failed to get container status \"92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5\": rpc error: code = NotFound desc = could not find container \"92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5\": container with ID starting with 92c36edd0d4b7281ca084e8fbdcddfb24d1fec792c615fa8ffc97c2f42cafbc5 not found: ID does not exist" Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.996118 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-szwtc"] Oct 01 15:07:57 crc kubenswrapper[4771]: I1001 15:07:57.996163 4771 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-szwtc"] Oct 01 15:07:58 crc kubenswrapper[4771]: I1001 15:07:58.953994 4771 generic.go:334] "Generic (PLEG): container finished" podID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerID="a1f2b5fcd1909720be7d9b722c5401ae15c413c08f615e7122edb7e6036dc4b4" exitCode=0 Oct 01 15:07:58 crc kubenswrapper[4771]: I1001 15:07:58.954260 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" event={"ID":"5c4805aa-64a3-4354-b3ab-ab48935503cf","Type":"ContainerDied","Data":"a1f2b5fcd1909720be7d9b722c5401ae15c413c08f615e7122edb7e6036dc4b4"} Oct 01 15:07:59 crc kubenswrapper[4771]: I1001 15:07:59.968027 4771 generic.go:334] "Generic (PLEG): container finished" podID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerID="1ffbe8021cd10a846a23d2861085b0afdac1b7f2621d620b4614b98d8fabd150" exitCode=0 Oct 01 15:07:59 crc kubenswrapper[4771]: I1001 15:07:59.968105 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" event={"ID":"5c4805aa-64a3-4354-b3ab-ab48935503cf","Type":"ContainerDied","Data":"1ffbe8021cd10a846a23d2861085b0afdac1b7f2621d620b4614b98d8fabd150"} Oct 01 15:08:00 crc kubenswrapper[4771]: I1001 15:08:00.001156 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9db1abd4-f11c-45e1-9341-6c818c3e3579" path="/var/lib/kubelet/pods/9db1abd4-f11c-45e1-9341-6c818c3e3579/volumes" Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.262125 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.414838 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-bundle\") pod \"5c4805aa-64a3-4354-b3ab-ab48935503cf\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.414938 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-util\") pod \"5c4805aa-64a3-4354-b3ab-ab48935503cf\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.414969 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pdnd\" (UniqueName: \"kubernetes.io/projected/5c4805aa-64a3-4354-b3ab-ab48935503cf-kube-api-access-5pdnd\") pod \"5c4805aa-64a3-4354-b3ab-ab48935503cf\" (UID: \"5c4805aa-64a3-4354-b3ab-ab48935503cf\") " Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.416459 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-bundle" (OuterVolumeSpecName: "bundle") pod "5c4805aa-64a3-4354-b3ab-ab48935503cf" (UID: "5c4805aa-64a3-4354-b3ab-ab48935503cf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.427647 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c4805aa-64a3-4354-b3ab-ab48935503cf-kube-api-access-5pdnd" (OuterVolumeSpecName: "kube-api-access-5pdnd") pod "5c4805aa-64a3-4354-b3ab-ab48935503cf" (UID: "5c4805aa-64a3-4354-b3ab-ab48935503cf"). InnerVolumeSpecName "kube-api-access-5pdnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.448925 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-util" (OuterVolumeSpecName: "util") pod "5c4805aa-64a3-4354-b3ab-ab48935503cf" (UID: "5c4805aa-64a3-4354-b3ab-ab48935503cf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.517147 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.517210 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/5c4805aa-64a3-4354-b3ab-ab48935503cf-util\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.517236 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pdnd\" (UniqueName: \"kubernetes.io/projected/5c4805aa-64a3-4354-b3ab-ab48935503cf-kube-api-access-5pdnd\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:01 crc kubenswrapper[4771]: I1001 15:08:01.987668 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" Oct 01 15:08:02 crc kubenswrapper[4771]: I1001 15:08:02.001914 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv" event={"ID":"5c4805aa-64a3-4354-b3ab-ab48935503cf","Type":"ContainerDied","Data":"976cf674d2852e34b80695c9b61d29b7aed6a9eb9e5208f4c1c9c839aa603f74"} Oct 01 15:08:02 crc kubenswrapper[4771]: I1001 15:08:02.002113 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976cf674d2852e34b80695c9b61d29b7aed6a9eb9e5208f4c1c9c839aa603f74" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.789910 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr"] Oct 01 15:08:09 crc kubenswrapper[4771]: E1001 15:08:09.790467 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerName="extract" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.790477 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerName="extract" Oct 01 15:08:09 crc kubenswrapper[4771]: E1001 15:08:09.790488 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9db1abd4-f11c-45e1-9341-6c818c3e3579" containerName="console" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.790494 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9db1abd4-f11c-45e1-9341-6c818c3e3579" containerName="console" Oct 01 15:08:09 crc kubenswrapper[4771]: E1001 15:08:09.790502 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerName="util" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.790508 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4805aa-64a3-4354-b3ab-ab48935503cf" 
containerName="util" Oct 01 15:08:09 crc kubenswrapper[4771]: E1001 15:08:09.790522 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerName="pull" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.790527 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerName="pull" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.790631 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c4805aa-64a3-4354-b3ab-ab48935503cf" containerName="extract" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.790640 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9db1abd4-f11c-45e1-9341-6c818c3e3579" containerName="console" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.790980 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.793757 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.794091 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.794445 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.794844 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.795461 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-rlwtq" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.817401 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35c6bc93-608d-4534-9ccd-493ea57f189d-apiservice-cert\") pod \"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.817464 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35c6bc93-608d-4534-9ccd-493ea57f189d-webhook-cert\") pod \"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.817491 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5nv9\" (UniqueName: \"kubernetes.io/projected/35c6bc93-608d-4534-9ccd-493ea57f189d-kube-api-access-r5nv9\") pod \"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.840825 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr"] Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.918832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35c6bc93-608d-4534-9ccd-493ea57f189d-apiservice-cert\") pod \"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: 
I1001 15:08:09.919223 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35c6bc93-608d-4534-9ccd-493ea57f189d-webhook-cert\") pod \"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.919359 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5nv9\" (UniqueName: \"kubernetes.io/projected/35c6bc93-608d-4534-9ccd-493ea57f189d-kube-api-access-r5nv9\") pod \"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.924791 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35c6bc93-608d-4534-9ccd-493ea57f189d-webhook-cert\") pod \"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.925399 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35c6bc93-608d-4534-9ccd-493ea57f189d-apiservice-cert\") pod \"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:09 crc kubenswrapper[4771]: I1001 15:08:09.950273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5nv9\" (UniqueName: \"kubernetes.io/projected/35c6bc93-608d-4534-9ccd-493ea57f189d-kube-api-access-r5nv9\") pod 
\"metallb-operator-controller-manager-554dcf567c-bnmzr\" (UID: \"35c6bc93-608d-4534-9ccd-493ea57f189d\") " pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.104824 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.142545 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-bd9669845-gnppv"] Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.143359 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.149249 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.149238 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.149571 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-lv2gv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.171292 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-bd9669845-gnppv"] Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.221977 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b09aca49-c227-4561-9f87-df661ec6d85c-webhook-cert\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: 
I1001 15:08:10.222347 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b09aca49-c227-4561-9f87-df661ec6d85c-apiservice-cert\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.222413 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m75wg\" (UniqueName: \"kubernetes.io/projected/b09aca49-c227-4561-9f87-df661ec6d85c-kube-api-access-m75wg\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.323117 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b09aca49-c227-4561-9f87-df661ec6d85c-webhook-cert\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.323184 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b09aca49-c227-4561-9f87-df661ec6d85c-apiservice-cert\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.323224 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m75wg\" (UniqueName: 
\"kubernetes.io/projected/b09aca49-c227-4561-9f87-df661ec6d85c-kube-api-access-m75wg\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.327113 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b09aca49-c227-4561-9f87-df661ec6d85c-apiservice-cert\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.327316 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b09aca49-c227-4561-9f87-df661ec6d85c-webhook-cert\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.340766 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m75wg\" (UniqueName: \"kubernetes.io/projected/b09aca49-c227-4561-9f87-df661ec6d85c-kube-api-access-m75wg\") pod \"metallb-operator-webhook-server-bd9669845-gnppv\" (UID: \"b09aca49-c227-4561-9f87-df661ec6d85c\") " pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.368563 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr"] Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.475826 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:10 crc kubenswrapper[4771]: I1001 15:08:10.747653 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-bd9669845-gnppv"] Oct 01 15:08:10 crc kubenswrapper[4771]: W1001 15:08:10.752344 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb09aca49_c227_4561_9f87_df661ec6d85c.slice/crio-f8b735ea0b19ad3d47e30fb2d17392321755acc73d523d64e69b22f71c3229f7 WatchSource:0}: Error finding container f8b735ea0b19ad3d47e30fb2d17392321755acc73d523d64e69b22f71c3229f7: Status 404 returned error can't find the container with id f8b735ea0b19ad3d47e30fb2d17392321755acc73d523d64e69b22f71c3229f7 Oct 01 15:08:11 crc kubenswrapper[4771]: I1001 15:08:11.035922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" event={"ID":"b09aca49-c227-4561-9f87-df661ec6d85c","Type":"ContainerStarted","Data":"f8b735ea0b19ad3d47e30fb2d17392321755acc73d523d64e69b22f71c3229f7"} Oct 01 15:08:11 crc kubenswrapper[4771]: I1001 15:08:11.038086 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" event={"ID":"35c6bc93-608d-4534-9ccd-493ea57f189d","Type":"ContainerStarted","Data":"399a511b99b527bf1e025d3abb35ed2ce96bac3d4deff79088cf5d5d1818adf4"} Oct 01 15:08:12 crc kubenswrapper[4771]: I1001 15:08:12.177358 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:08:12 crc kubenswrapper[4771]: I1001 15:08:12.177435 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:08:14 crc kubenswrapper[4771]: I1001 15:08:14.056505 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" event={"ID":"35c6bc93-608d-4534-9ccd-493ea57f189d","Type":"ContainerStarted","Data":"ddbbe5615ac00c3006ba6ee31edcd4802618ccfacae07fc892791ca6649229ee"} Oct 01 15:08:14 crc kubenswrapper[4771]: I1001 15:08:14.057307 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:14 crc kubenswrapper[4771]: I1001 15:08:14.075121 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" podStartSLOduration=1.776107674 podStartE2EDuration="5.075107087s" podCreationTimestamp="2025-10-01 15:08:09 +0000 UTC" firstStartedPulling="2025-10-01 15:08:10.379410613 +0000 UTC m=+734.998585784" lastFinishedPulling="2025-10-01 15:08:13.678410016 +0000 UTC m=+738.297585197" observedRunningTime="2025-10-01 15:08:14.073770954 +0000 UTC m=+738.692946125" watchObservedRunningTime="2025-10-01 15:08:14.075107087 +0000 UTC m=+738.694282258" Oct 01 15:08:16 crc kubenswrapper[4771]: I1001 15:08:16.068116 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" event={"ID":"b09aca49-c227-4561-9f87-df661ec6d85c","Type":"ContainerStarted","Data":"1980226db17b3b5c1f308a9613e6802a425ff357af70cbc4136ad53ca3ad8415"} Oct 01 15:08:17 crc kubenswrapper[4771]: I1001 15:08:17.073662 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.116501 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" podStartSLOduration=15.23777254 podStartE2EDuration="20.116484363s" podCreationTimestamp="2025-10-01 15:08:10 +0000 UTC" firstStartedPulling="2025-10-01 15:08:10.754987459 +0000 UTC m=+735.374162630" lastFinishedPulling="2025-10-01 15:08:15.633699282 +0000 UTC m=+740.252874453" observedRunningTime="2025-10-01 15:08:16.084399754 +0000 UTC m=+740.703574975" watchObservedRunningTime="2025-10-01 15:08:30.116484363 +0000 UTC m=+754.735659534" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.118384 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mjvfw"] Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.118885 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" podUID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" containerName="controller-manager" containerID="cri-o://4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af" gracePeriod=30 Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.151566 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg"] Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.151837 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" podUID="335db6c7-efb8-4055-aacf-8262b4ec5b91" containerName="route-controller-manager" containerID="cri-o://f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c" gracePeriod=30 Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.480948 4771 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-bd9669845-gnppv" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.511079 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.598815 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-config\") pod \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.599682 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-config" (OuterVolumeSpecName: "config") pod "cd8df04a-9a5e-4784-ac97-9782d936fa5e" (UID: "cd8df04a-9a5e-4784-ac97-9782d936fa5e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.609318 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.699651 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd8df04a-9a5e-4784-ac97-9782d936fa5e-serving-cert\") pod \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.699696 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-client-ca\") pod \"335db6c7-efb8-4055-aacf-8262b4ec5b91\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.699833 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-proxy-ca-bundles\") pod \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.699857 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwcdw\" (UniqueName: \"kubernetes.io/projected/cd8df04a-9a5e-4784-ac97-9782d936fa5e-kube-api-access-dwcdw\") pod \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.699877 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-config\") pod \"335db6c7-efb8-4055-aacf-8262b4ec5b91\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.699895 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/335db6c7-efb8-4055-aacf-8262b4ec5b91-serving-cert\") pod \"335db6c7-efb8-4055-aacf-8262b4ec5b91\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.699915 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-client-ca\") pod \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\" (UID: \"cd8df04a-9a5e-4784-ac97-9782d936fa5e\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.699947 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n5gh\" (UniqueName: \"kubernetes.io/projected/335db6c7-efb8-4055-aacf-8262b4ec5b91-kube-api-access-2n5gh\") pod \"335db6c7-efb8-4055-aacf-8262b4ec5b91\" (UID: \"335db6c7-efb8-4055-aacf-8262b4ec5b91\") " Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.700104 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.700191 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cd8df04a-9a5e-4784-ac97-9782d936fa5e" (UID: "cd8df04a-9a5e-4784-ac97-9782d936fa5e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.700565 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-client-ca" (OuterVolumeSpecName: "client-ca") pod "cd8df04a-9a5e-4784-ac97-9782d936fa5e" (UID: "cd8df04a-9a5e-4784-ac97-9782d936fa5e"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.700676 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-client-ca" (OuterVolumeSpecName: "client-ca") pod "335db6c7-efb8-4055-aacf-8262b4ec5b91" (UID: "335db6c7-efb8-4055-aacf-8262b4ec5b91"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.700750 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-config" (OuterVolumeSpecName: "config") pod "335db6c7-efb8-4055-aacf-8262b4ec5b91" (UID: "335db6c7-efb8-4055-aacf-8262b4ec5b91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.710190 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/335db6c7-efb8-4055-aacf-8262b4ec5b91-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "335db6c7-efb8-4055-aacf-8262b4ec5b91" (UID: "335db6c7-efb8-4055-aacf-8262b4ec5b91"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.710523 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/335db6c7-efb8-4055-aacf-8262b4ec5b91-kube-api-access-2n5gh" (OuterVolumeSpecName: "kube-api-access-2n5gh") pod "335db6c7-efb8-4055-aacf-8262b4ec5b91" (UID: "335db6c7-efb8-4055-aacf-8262b4ec5b91"). InnerVolumeSpecName "kube-api-access-2n5gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.710821 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd8df04a-9a5e-4784-ac97-9782d936fa5e-kube-api-access-dwcdw" (OuterVolumeSpecName: "kube-api-access-dwcdw") pod "cd8df04a-9a5e-4784-ac97-9782d936fa5e" (UID: "cd8df04a-9a5e-4784-ac97-9782d936fa5e"). InnerVolumeSpecName "kube-api-access-dwcdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.711039 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd8df04a-9a5e-4784-ac97-9782d936fa5e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cd8df04a-9a5e-4784-ac97-9782d936fa5e" (UID: "cd8df04a-9a5e-4784-ac97-9782d936fa5e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.801639 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n5gh\" (UniqueName: \"kubernetes.io/projected/335db6c7-efb8-4055-aacf-8262b4ec5b91-kube-api-access-2n5gh\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.801694 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd8df04a-9a5e-4784-ac97-9782d936fa5e-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.801713 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.801788 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.801805 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwcdw\" (UniqueName: \"kubernetes.io/projected/cd8df04a-9a5e-4784-ac97-9782d936fa5e-kube-api-access-dwcdw\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.801818 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/335db6c7-efb8-4055-aacf-8262b4ec5b91-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.801830 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/335db6c7-efb8-4055-aacf-8262b4ec5b91-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:30 crc kubenswrapper[4771]: I1001 15:08:30.801843 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cd8df04a-9a5e-4784-ac97-9782d936fa5e-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.154798 4771 generic.go:334] "Generic (PLEG): container finished" podID="335db6c7-efb8-4055-aacf-8262b4ec5b91" containerID="f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c" exitCode=0 Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.154884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" event={"ID":"335db6c7-efb8-4055-aacf-8262b4ec5b91","Type":"ContainerDied","Data":"f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c"} Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.154922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" 
event={"ID":"335db6c7-efb8-4055-aacf-8262b4ec5b91","Type":"ContainerDied","Data":"722b0373c3f9bac7910d2d75be73c212cbb50e883aa52ddf6a0ae21ae5721f77"} Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.154950 4771 scope.go:117] "RemoveContainer" containerID="f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.155097 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.159987 4771 generic.go:334] "Generic (PLEG): container finished" podID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" containerID="4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af" exitCode=0 Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.160030 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" event={"ID":"cd8df04a-9a5e-4784-ac97-9782d936fa5e","Type":"ContainerDied","Data":"4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af"} Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.160055 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" event={"ID":"cd8df04a-9a5e-4784-ac97-9782d936fa5e","Type":"ContainerDied","Data":"04b1598296b2c2a54e152e156a7b6a4d9a4db06b2f471b4e81adf4cd7f938b26"} Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.160102 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mjvfw" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.175747 4771 scope.go:117] "RemoveContainer" containerID="f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c" Oct 01 15:08:31 crc kubenswrapper[4771]: E1001 15:08:31.176214 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c\": container with ID starting with f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c not found: ID does not exist" containerID="f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.176242 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c"} err="failed to get container status \"f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c\": rpc error: code = NotFound desc = could not find container \"f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c\": container with ID starting with f4d36301a185aa3ca50c02164c022c1580b0b325c55edb2d3ea1c466358e2d8c not found: ID does not exist" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.176262 4771 scope.go:117] "RemoveContainer" containerID="4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.196642 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mjvfw"] Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.198883 4771 scope.go:117] "RemoveContainer" containerID="4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af" Oct 01 15:08:31 crc kubenswrapper[4771]: E1001 15:08:31.199389 4771 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af\": container with ID starting with 4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af not found: ID does not exist" containerID="4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.199422 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af"} err="failed to get container status \"4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af\": rpc error: code = NotFound desc = could not find container \"4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af\": container with ID starting with 4c3e9c976c2eb0ace6d070c414635b01ca1e0cd35e2030f5d695a1b9a0bd35af not found: ID does not exist" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.205194 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mjvfw"] Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.218789 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg"] Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.232802 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-dsjtg"] Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.991521 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="335db6c7-efb8-4055-aacf-8262b4ec5b91" path="/var/lib/kubelet/pods/335db6c7-efb8-4055-aacf-8262b4ec5b91/volumes" Oct 01 15:08:31 crc kubenswrapper[4771]: I1001 15:08:31.992320 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" 
path="/var/lib/kubelet/pods/cd8df04a-9a5e-4784-ac97-9782d936fa5e/volumes" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.095566 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7699d4447b-94ncw"] Oct 01 15:08:32 crc kubenswrapper[4771]: E1001 15:08:32.095825 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" containerName="controller-manager" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.095845 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" containerName="controller-manager" Oct 01 15:08:32 crc kubenswrapper[4771]: E1001 15:08:32.095867 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="335db6c7-efb8-4055-aacf-8262b4ec5b91" containerName="route-controller-manager" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.095876 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="335db6c7-efb8-4055-aacf-8262b4ec5b91" containerName="route-controller-manager" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.096009 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd8df04a-9a5e-4784-ac97-9782d936fa5e" containerName="controller-manager" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.096037 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="335db6c7-efb8-4055-aacf-8262b4ec5b91" containerName="route-controller-manager" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.096447 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.098855 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.099289 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.099748 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.099869 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.100993 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.102122 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647"] Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.103255 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.106383 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.106410 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.106673 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.106817 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.106928 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.107166 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.107849 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.110431 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7699d4447b-94ncw"] Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.112177 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.132523 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647"] Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.164757 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7699d4447b-94ncw"] Oct 01 15:08:32 crc kubenswrapper[4771]: E1001 15:08:32.165418 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-kbpmp proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" podUID="3cc72c1a-01c7-4b4a-8144-b389ec172c11" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.185855 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647"] Oct 01 15:08:32 crc kubenswrapper[4771]: E1001 15:08:32.186322 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config kube-api-access-hgv99 serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" podUID="3451b802-59e8-48d9-b5ca-feafeddc6712" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.222427 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-client-ca\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.222472 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc72c1a-01c7-4b4a-8144-b389ec172c11-serving-cert\") pod 
\"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.222502 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgv99\" (UniqueName: \"kubernetes.io/projected/3451b802-59e8-48d9-b5ca-feafeddc6712-kube-api-access-hgv99\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.222710 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-proxy-ca-bundles\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.222833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3451b802-59e8-48d9-b5ca-feafeddc6712-serving-cert\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.222861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-config\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc 
kubenswrapper[4771]: I1001 15:08:32.222885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-config\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.222976 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpmp\" (UniqueName: \"kubernetes.io/projected/3cc72c1a-01c7-4b4a-8144-b389ec172c11-kube-api-access-kbpmp\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.223019 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-client-ca\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324029 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-client-ca\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc72c1a-01c7-4b4a-8144-b389ec172c11-serving-cert\") pod 
\"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324146 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgv99\" (UniqueName: \"kubernetes.io/projected/3451b802-59e8-48d9-b5ca-feafeddc6712-kube-api-access-hgv99\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324183 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-proxy-ca-bundles\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324226 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3451b802-59e8-48d9-b5ca-feafeddc6712-serving-cert\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324251 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-config\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324272 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-config\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324305 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpmp\" (UniqueName: \"kubernetes.io/projected/3cc72c1a-01c7-4b4a-8144-b389ec172c11-kube-api-access-kbpmp\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.324326 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-client-ca\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.325262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-client-ca\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.325262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-client-ca\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " 
pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.326250 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-config\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.326638 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-config\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.327440 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-proxy-ca-bundles\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.329676 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3451b802-59e8-48d9-b5ca-feafeddc6712-serving-cert\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.329804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc72c1a-01c7-4b4a-8144-b389ec172c11-serving-cert\") pod 
\"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.356685 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgv99\" (UniqueName: \"kubernetes.io/projected/3451b802-59e8-48d9-b5ca-feafeddc6712-kube-api-access-hgv99\") pod \"route-controller-manager-6f5856f6cb-t2647\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:32 crc kubenswrapper[4771]: I1001 15:08:32.359077 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbpmp\" (UniqueName: \"kubernetes.io/projected/3cc72c1a-01c7-4b4a-8144-b389ec172c11-kube-api-access-kbpmp\") pod \"controller-manager-7699d4447b-94ncw\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.173519 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.173571 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.199920 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.207150 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.336801 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-config\") pod \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.336859 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc72c1a-01c7-4b4a-8144-b389ec172c11-serving-cert\") pod \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.336907 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgv99\" (UniqueName: \"kubernetes.io/projected/3451b802-59e8-48d9-b5ca-feafeddc6712-kube-api-access-hgv99\") pod \"3451b802-59e8-48d9-b5ca-feafeddc6712\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.336954 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-config\") pod \"3451b802-59e8-48d9-b5ca-feafeddc6712\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.336980 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-client-ca\") pod \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.337001 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-client-ca\") pod \"3451b802-59e8-48d9-b5ca-feafeddc6712\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.337016 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-proxy-ca-bundles\") pod \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.337037 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbpmp\" (UniqueName: \"kubernetes.io/projected/3cc72c1a-01c7-4b4a-8144-b389ec172c11-kube-api-access-kbpmp\") pod \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\" (UID: \"3cc72c1a-01c7-4b4a-8144-b389ec172c11\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.337053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3451b802-59e8-48d9-b5ca-feafeddc6712-serving-cert\") pod \"3451b802-59e8-48d9-b5ca-feafeddc6712\" (UID: \"3451b802-59e8-48d9-b5ca-feafeddc6712\") " Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.337551 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-config" (OuterVolumeSpecName: "config") pod "3451b802-59e8-48d9-b5ca-feafeddc6712" (UID: "3451b802-59e8-48d9-b5ca-feafeddc6712"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.337616 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-config" (OuterVolumeSpecName: "config") pod "3cc72c1a-01c7-4b4a-8144-b389ec172c11" (UID: "3cc72c1a-01c7-4b4a-8144-b389ec172c11"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.337892 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-client-ca" (OuterVolumeSpecName: "client-ca") pod "3451b802-59e8-48d9-b5ca-feafeddc6712" (UID: "3451b802-59e8-48d9-b5ca-feafeddc6712"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.338135 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3cc72c1a-01c7-4b4a-8144-b389ec172c11" (UID: "3cc72c1a-01c7-4b4a-8144-b389ec172c11"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.338160 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-client-ca" (OuterVolumeSpecName: "client-ca") pod "3cc72c1a-01c7-4b4a-8144-b389ec172c11" (UID: "3cc72c1a-01c7-4b4a-8144-b389ec172c11"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.345870 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cc72c1a-01c7-4b4a-8144-b389ec172c11-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3cc72c1a-01c7-4b4a-8144-b389ec172c11" (UID: "3cc72c1a-01c7-4b4a-8144-b389ec172c11"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.345896 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cc72c1a-01c7-4b4a-8144-b389ec172c11-kube-api-access-kbpmp" (OuterVolumeSpecName: "kube-api-access-kbpmp") pod "3cc72c1a-01c7-4b4a-8144-b389ec172c11" (UID: "3cc72c1a-01c7-4b4a-8144-b389ec172c11"). InnerVolumeSpecName "kube-api-access-kbpmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.345914 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3451b802-59e8-48d9-b5ca-feafeddc6712-kube-api-access-hgv99" (OuterVolumeSpecName: "kube-api-access-hgv99") pod "3451b802-59e8-48d9-b5ca-feafeddc6712" (UID: "3451b802-59e8-48d9-b5ca-feafeddc6712"). InnerVolumeSpecName "kube-api-access-hgv99". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.348919 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3451b802-59e8-48d9-b5ca-feafeddc6712-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3451b802-59e8-48d9-b5ca-feafeddc6712" (UID: "3451b802-59e8-48d9-b5ca-feafeddc6712"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438460 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438499 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438512 4771 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3451b802-59e8-48d9-b5ca-feafeddc6712-client-ca\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438523 4771 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438538 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbpmp\" (UniqueName: \"kubernetes.io/projected/3cc72c1a-01c7-4b4a-8144-b389ec172c11-kube-api-access-kbpmp\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438550 4771 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3451b802-59e8-48d9-b5ca-feafeddc6712-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438561 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cc72c1a-01c7-4b4a-8144-b389ec172c11-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438571 4771 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cc72c1a-01c7-4b4a-8144-b389ec172c11-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:33 crc kubenswrapper[4771]: I1001 15:08:33.438583 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgv99\" (UniqueName: \"kubernetes.io/projected/3451b802-59e8-48d9-b5ca-feafeddc6712-kube-api-access-hgv99\") on node \"crc\" DevicePath \"\"" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.177775 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.177797 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7699d4447b-94ncw" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.250531 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7677fc5b54-nd5r5"] Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.251353 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.254367 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.254540 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.254653 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.254649 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.254865 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.255577 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.264745 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7677fc5b54-nd5r5"] Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.271175 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7699d4447b-94ncw"] Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.275023 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.275600 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7699d4447b-94ncw"] Oct 01 15:08:34 crc 
kubenswrapper[4771]: I1001 15:08:34.310451 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647"] Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.318041 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f5856f6cb-t2647"] Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.352202 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-client-ca\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.352262 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-config\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.352284 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22026766-68f7-4944-9530-96bc3f2fb9f6-serving-cert\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.352314 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v92qg\" (UniqueName: \"kubernetes.io/projected/22026766-68f7-4944-9530-96bc3f2fb9f6-kube-api-access-v92qg\") pod 
\"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.352345 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-proxy-ca-bundles\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.453339 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-proxy-ca-bundles\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.453457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-client-ca\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.453500 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-config\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.453521 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/22026766-68f7-4944-9530-96bc3f2fb9f6-serving-cert\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.453551 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v92qg\" (UniqueName: \"kubernetes.io/projected/22026766-68f7-4944-9530-96bc3f2fb9f6-kube-api-access-v92qg\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.454932 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-client-ca\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.455242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-config\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.459713 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22026766-68f7-4944-9530-96bc3f2fb9f6-serving-cert\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.468415 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/22026766-68f7-4944-9530-96bc3f2fb9f6-proxy-ca-bundles\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.472659 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v92qg\" (UniqueName: \"kubernetes.io/projected/22026766-68f7-4944-9530-96bc3f2fb9f6-kube-api-access-v92qg\") pod \"controller-manager-7677fc5b54-nd5r5\" (UID: \"22026766-68f7-4944-9530-96bc3f2fb9f6\") " pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:34 crc kubenswrapper[4771]: I1001 15:08:34.568965 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:35 crc kubenswrapper[4771]: I1001 15:08:35.027529 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7677fc5b54-nd5r5"] Oct 01 15:08:35 crc kubenswrapper[4771]: I1001 15:08:35.182636 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" event={"ID":"22026766-68f7-4944-9530-96bc3f2fb9f6","Type":"ContainerStarted","Data":"aac1a4ebfb578293e4ad62f22b61a83a38b05ee6adafd844ef1bbc325b43c064"} Oct 01 15:08:35 crc kubenswrapper[4771]: I1001 15:08:35.996254 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3451b802-59e8-48d9-b5ca-feafeddc6712" path="/var/lib/kubelet/pods/3451b802-59e8-48d9-b5ca-feafeddc6712/volumes" Oct 01 15:08:35 crc kubenswrapper[4771]: I1001 15:08:35.996703 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cc72c1a-01c7-4b4a-8144-b389ec172c11" 
path="/var/lib/kubelet/pods/3cc72c1a-01c7-4b4a-8144-b389ec172c11/volumes" Oct 01 15:08:36 crc kubenswrapper[4771]: I1001 15:08:36.191074 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" event={"ID":"22026766-68f7-4944-9530-96bc3f2fb9f6","Type":"ContainerStarted","Data":"3efc12b21244d933d023597e3611b253fd63f91507513f45c1fbb81a5de6236b"} Oct 01 15:08:36 crc kubenswrapper[4771]: I1001 15:08:36.191671 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:36 crc kubenswrapper[4771]: I1001 15:08:36.200547 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" Oct 01 15:08:36 crc kubenswrapper[4771]: I1001 15:08:36.214580 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7677fc5b54-nd5r5" podStartSLOduration=4.214552536 podStartE2EDuration="4.214552536s" podCreationTimestamp="2025-10-01 15:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:08:36.207366088 +0000 UTC m=+760.826541269" watchObservedRunningTime="2025-10-01 15:08:36.214552536 +0000 UTC m=+760.833727707" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.102611 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d"] Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.104048 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.106070 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.107807 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.109767 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.109778 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.109809 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.109864 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.123067 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d"] Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.185816 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49808535-0731-4485-a0da-349b2111f4c6-config\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.185866 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvflt\" (UniqueName: \"kubernetes.io/projected/49808535-0731-4485-a0da-349b2111f4c6-kube-api-access-fvflt\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.186041 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49808535-0731-4485-a0da-349b2111f4c6-client-ca\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.186096 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49808535-0731-4485-a0da-349b2111f4c6-serving-cert\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.287055 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49808535-0731-4485-a0da-349b2111f4c6-config\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.287132 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvflt\" (UniqueName: \"kubernetes.io/projected/49808535-0731-4485-a0da-349b2111f4c6-kube-api-access-fvflt\") pod 
\"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.287255 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49808535-0731-4485-a0da-349b2111f4c6-client-ca\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.287289 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49808535-0731-4485-a0da-349b2111f4c6-serving-cert\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.288588 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49808535-0731-4485-a0da-349b2111f4c6-config\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.289096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/49808535-0731-4485-a0da-349b2111f4c6-client-ca\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.296451 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49808535-0731-4485-a0da-349b2111f4c6-serving-cert\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.313757 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvflt\" (UniqueName: \"kubernetes.io/projected/49808535-0731-4485-a0da-349b2111f4c6-kube-api-access-fvflt\") pod \"route-controller-manager-5486d5d6b5-xnd6d\" (UID: \"49808535-0731-4485-a0da-349b2111f4c6\") " pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.434545 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:37 crc kubenswrapper[4771]: I1001 15:08:37.886436 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d"] Oct 01 15:08:38 crc kubenswrapper[4771]: I1001 15:08:38.209797 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" event={"ID":"49808535-0731-4485-a0da-349b2111f4c6","Type":"ContainerStarted","Data":"04a37c935b9b9e31fc924cb830eebf0fda3a0e9eea65746697d56cc9f675d777"} Oct 01 15:08:38 crc kubenswrapper[4771]: I1001 15:08:38.210136 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" event={"ID":"49808535-0731-4485-a0da-349b2111f4c6","Type":"ContainerStarted","Data":"fd53706915791d559f44e32720822cb81940983abd7f6ea823eadcec45b8d55f"} Oct 01 15:08:39 crc kubenswrapper[4771]: I1001 15:08:39.215346 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:39 crc kubenswrapper[4771]: I1001 15:08:39.220601 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" Oct 01 15:08:39 crc kubenswrapper[4771]: I1001 15:08:39.237367 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5486d5d6b5-xnd6d" podStartSLOduration=7.237350781 podStartE2EDuration="7.237350781s" podCreationTimestamp="2025-10-01 15:08:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:08:38.227675288 +0000 UTC m=+762.846850469" watchObservedRunningTime="2025-10-01 15:08:39.237350781 +0000 UTC m=+763.856525952" Oct 01 15:08:41 crc kubenswrapper[4771]: I1001 15:08:41.212010 4771 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 01 15:08:42 crc kubenswrapper[4771]: I1001 15:08:42.177771 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:08:42 crc kubenswrapper[4771]: I1001 15:08:42.178075 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.107512 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/metallb-operator-controller-manager-554dcf567c-bnmzr" Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.905856 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-mp8gm"] Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.909025 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.912323 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.912847 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.912882 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-jf2m5" Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.934894 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5"] Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.936056 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.940854 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 01 15:08:50 crc kubenswrapper[4771]: I1001 15:08:50.949029 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5"] Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.022334 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-s5glz"] Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.023196 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.025671 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.025887 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.025984 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-w5768" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.028045 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.058712 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-ffb8z"] Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.059747 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.062177 4771 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.076959 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-frr-sockets\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.077017 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-metrics\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.077076 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36cbe93e-7162-4367-a756-e731b356fa91-metrics-certs\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.077119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-frr-conf\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.077146 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6f76\" (UniqueName: 
\"kubernetes.io/projected/36cbe93e-7162-4367-a756-e731b356fa91-kube-api-access-t6f76\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.077210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-reloader\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.077237 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psjk7\" (UniqueName: \"kubernetes.io/projected/d0200ff4-2245-408d-bf5f-28479e049c57-kube-api-access-psjk7\") pod \"frr-k8s-webhook-server-5478bdb765-4lvs5\" (UID: \"d0200ff4-2245-408d-bf5f-28479e049c57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.077289 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/36cbe93e-7162-4367-a756-e731b356fa91-frr-startup\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.077315 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0200ff4-2245-408d-bf5f-28479e049c57-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4lvs5\" (UID: \"d0200ff4-2245-408d-bf5f-28479e049c57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.078112 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-ffb8z"] Oct 01 15:08:51 crc 
kubenswrapper[4771]: I1001 15:08:51.178365 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-metrics-certs\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178424 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36cbe93e-7162-4367-a756-e731b356fa91-metrics-certs\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178444 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf712988-695e-4dae-a121-ce52bf39689e-metallb-excludel2\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178470 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-frr-conf\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178494 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6f76\" (UniqueName: \"kubernetes.io/projected/36cbe93e-7162-4367-a756-e731b356fa91-kube-api-access-t6f76\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178555 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9gg\" (UniqueName: \"kubernetes.io/projected/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-kube-api-access-4f9gg\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-cert\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178594 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-reloader\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psjk7\" (UniqueName: \"kubernetes.io/projected/d0200ff4-2245-408d-bf5f-28479e049c57-kube-api-access-psjk7\") pod \"frr-k8s-webhook-server-5478bdb765-4lvs5\" (UID: \"d0200ff4-2245-408d-bf5f-28479e049c57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178635 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/36cbe93e-7162-4367-a756-e731b356fa91-frr-startup\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178655 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2vsl\" (UniqueName: \"kubernetes.io/projected/cf712988-695e-4dae-a121-ce52bf39689e-kube-api-access-l2vsl\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178671 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0200ff4-2245-408d-bf5f-28479e049c57-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4lvs5\" (UID: \"d0200ff4-2245-408d-bf5f-28479e049c57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-frr-sockets\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178706 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-metrics\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178752 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.178767 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-metrics-certs\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: E1001 15:08:51.178944 4771 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Oct 01 15:08:51 crc kubenswrapper[4771]: E1001 15:08:51.178998 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0200ff4-2245-408d-bf5f-28479e049c57-cert podName:d0200ff4-2245-408d-bf5f-28479e049c57 nodeName:}" failed. No retries permitted until 2025-10-01 15:08:51.678983569 +0000 UTC m=+776.298158740 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d0200ff4-2245-408d-bf5f-28479e049c57-cert") pod "frr-k8s-webhook-server-5478bdb765-4lvs5" (UID: "d0200ff4-2245-408d-bf5f-28479e049c57") : secret "frr-k8s-webhook-server-cert" not found Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.179048 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-reloader\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.179053 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-frr-conf\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.179227 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-frr-sockets\") pod \"frr-k8s-mp8gm\" (UID: 
\"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.179297 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/36cbe93e-7162-4367-a756-e731b356fa91-metrics\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.179944 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/36cbe93e-7162-4367-a756-e731b356fa91-frr-startup\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.186277 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36cbe93e-7162-4367-a756-e731b356fa91-metrics-certs\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.198193 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6f76\" (UniqueName: \"kubernetes.io/projected/36cbe93e-7162-4367-a756-e731b356fa91-kube-api-access-t6f76\") pod \"frr-k8s-mp8gm\" (UID: \"36cbe93e-7162-4367-a756-e731b356fa91\") " pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.216978 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psjk7\" (UniqueName: \"kubernetes.io/projected/d0200ff4-2245-408d-bf5f-28479e049c57-kube-api-access-psjk7\") pod \"frr-k8s-webhook-server-5478bdb765-4lvs5\" (UID: \"d0200ff4-2245-408d-bf5f-28479e049c57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.236467 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.280302 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.280343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-metrics-certs\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.280380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-metrics-certs\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.280401 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf712988-695e-4dae-a121-ce52bf39689e-metallb-excludel2\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.280436 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9gg\" (UniqueName: \"kubernetes.io/projected/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-kube-api-access-4f9gg\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc 
kubenswrapper[4771]: I1001 15:08:51.280454 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-cert\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: E1001 15:08:51.280472 4771 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.280488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2vsl\" (UniqueName: \"kubernetes.io/projected/cf712988-695e-4dae-a121-ce52bf39689e-kube-api-access-l2vsl\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: E1001 15:08:51.280544 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist podName:cf712988-695e-4dae-a121-ce52bf39689e nodeName:}" failed. No retries permitted until 2025-10-01 15:08:51.780524721 +0000 UTC m=+776.399699892 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist") pod "speaker-s5glz" (UID: "cf712988-695e-4dae-a121-ce52bf39689e") : secret "metallb-memberlist" not found Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.281364 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cf712988-695e-4dae-a121-ce52bf39689e-metallb-excludel2\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: E1001 15:08:51.281627 4771 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Oct 01 15:08:51 crc kubenswrapper[4771]: E1001 15:08:51.281821 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-metrics-certs podName:e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4 nodeName:}" failed. No retries permitted until 2025-10-01 15:08:51.781782792 +0000 UTC m=+776.400958023 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-metrics-certs") pod "controller-5d688f5ffc-ffb8z" (UID: "e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4") : secret "controller-certs-secret" not found Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.285302 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-metrics-certs\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.287017 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-cert\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.298996 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9gg\" (UniqueName: \"kubernetes.io/projected/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-kube-api-access-4f9gg\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.306594 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2vsl\" (UniqueName: \"kubernetes.io/projected/cf712988-695e-4dae-a121-ce52bf39689e-kube-api-access-l2vsl\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.685554 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0200ff4-2245-408d-bf5f-28479e049c57-cert\") pod 
\"frr-k8s-webhook-server-5478bdb765-4lvs5\" (UID: \"d0200ff4-2245-408d-bf5f-28479e049c57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.691473 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d0200ff4-2245-408d-bf5f-28479e049c57-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4lvs5\" (UID: \"d0200ff4-2245-408d-bf5f-28479e049c57\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.787835 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.787928 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-metrics-certs\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: E1001 15:08:51.788122 4771 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 01 15:08:51 crc kubenswrapper[4771]: E1001 15:08:51.788234 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist podName:cf712988-695e-4dae-a121-ce52bf39689e nodeName:}" failed. No retries permitted until 2025-10-01 15:08:52.788206047 +0000 UTC m=+777.407381258 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist") pod "speaker-s5glz" (UID: "cf712988-695e-4dae-a121-ce52bf39689e") : secret "metallb-memberlist" not found Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.793526 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4-metrics-certs\") pod \"controller-5d688f5ffc-ffb8z\" (UID: \"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4\") " pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.857579 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:51 crc kubenswrapper[4771]: I1001 15:08:51.972669 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:52 crc kubenswrapper[4771]: I1001 15:08:52.302386 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5"] Oct 01 15:08:52 crc kubenswrapper[4771]: I1001 15:08:52.309030 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerStarted","Data":"bb23942d4be46c5af530f71fc17203b7ce7fbacf9323af0f0f45a1df4d6b068a"} Oct 01 15:08:52 crc kubenswrapper[4771]: W1001 15:08:52.336259 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0200ff4_2245_408d_bf5f_28479e049c57.slice/crio-e82995e9f4cb67baa501dd29abc26487a67cb4ea154de416d285030bba5da229 WatchSource:0}: Error finding container e82995e9f4cb67baa501dd29abc26487a67cb4ea154de416d285030bba5da229: Status 404 returned error can't find the container with id 
e82995e9f4cb67baa501dd29abc26487a67cb4ea154de416d285030bba5da229 Oct 01 15:08:52 crc kubenswrapper[4771]: I1001 15:08:52.444047 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-ffb8z"] Oct 01 15:08:52 crc kubenswrapper[4771]: W1001 15:08:52.448194 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2cac0e6_5d1c_4914_b9f6_334aeedcf2d4.slice/crio-8502ddcb545ffa63d5fe12d922e710cc4abd4d33f74f271ac4cc988e45516778 WatchSource:0}: Error finding container 8502ddcb545ffa63d5fe12d922e710cc4abd4d33f74f271ac4cc988e45516778: Status 404 returned error can't find the container with id 8502ddcb545ffa63d5fe12d922e710cc4abd4d33f74f271ac4cc988e45516778 Oct 01 15:08:52 crc kubenswrapper[4771]: I1001 15:08:52.807914 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:52 crc kubenswrapper[4771]: I1001 15:08:52.814401 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cf712988-695e-4dae-a121-ce52bf39689e-memberlist\") pod \"speaker-s5glz\" (UID: \"cf712988-695e-4dae-a121-ce52bf39689e\") " pod="metallb-system/speaker-s5glz" Oct 01 15:08:52 crc kubenswrapper[4771]: I1001 15:08:52.838879 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-s5glz" Oct 01 15:08:52 crc kubenswrapper[4771]: W1001 15:08:52.856619 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf712988_695e_4dae_a121_ce52bf39689e.slice/crio-3c0672a5a24438aa31ffc21d1510a742152fb3de3244f0cdd09b4b3e735390f6 WatchSource:0}: Error finding container 3c0672a5a24438aa31ffc21d1510a742152fb3de3244f0cdd09b4b3e735390f6: Status 404 returned error can't find the container with id 3c0672a5a24438aa31ffc21d1510a742152fb3de3244f0cdd09b4b3e735390f6 Oct 01 15:08:53 crc kubenswrapper[4771]: I1001 15:08:53.357830 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ffb8z" event={"ID":"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4","Type":"ContainerStarted","Data":"13d342dfeb51f75af0c5a654ac15ae940c3678c98069eaa9937bb29e4fbd5fa1"} Oct 01 15:08:53 crc kubenswrapper[4771]: I1001 15:08:53.357872 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ffb8z" event={"ID":"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4","Type":"ContainerStarted","Data":"a13f0ad4b4105f4c8218ffa371267178e5b8c2927e8bdce97b58ff5b4d6e676b"} Oct 01 15:08:53 crc kubenswrapper[4771]: I1001 15:08:53.357886 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-ffb8z" event={"ID":"e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4","Type":"ContainerStarted","Data":"8502ddcb545ffa63d5fe12d922e710cc4abd4d33f74f271ac4cc988e45516778"} Oct 01 15:08:53 crc kubenswrapper[4771]: I1001 15:08:53.358604 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:08:53 crc kubenswrapper[4771]: I1001 15:08:53.360527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" 
event={"ID":"d0200ff4-2245-408d-bf5f-28479e049c57","Type":"ContainerStarted","Data":"e82995e9f4cb67baa501dd29abc26487a67cb4ea154de416d285030bba5da229"} Oct 01 15:08:53 crc kubenswrapper[4771]: I1001 15:08:53.372986 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s5glz" event={"ID":"cf712988-695e-4dae-a121-ce52bf39689e","Type":"ContainerStarted","Data":"e1552f1bf185678fe29ca8173cd913df02cd10bd02ef7698385403fc33d69f58"} Oct 01 15:08:53 crc kubenswrapper[4771]: I1001 15:08:53.373039 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s5glz" event={"ID":"cf712988-695e-4dae-a121-ce52bf39689e","Type":"ContainerStarted","Data":"3c0672a5a24438aa31ffc21d1510a742152fb3de3244f0cdd09b4b3e735390f6"} Oct 01 15:08:53 crc kubenswrapper[4771]: I1001 15:08:53.394502 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-ffb8z" podStartSLOduration=2.394487397 podStartE2EDuration="2.394487397s" podCreationTimestamp="2025-10-01 15:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:08:53.393454761 +0000 UTC m=+778.012629932" watchObservedRunningTime="2025-10-01 15:08:53.394487397 +0000 UTC m=+778.013662568" Oct 01 15:08:54 crc kubenswrapper[4771]: I1001 15:08:54.387525 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-s5glz" event={"ID":"cf712988-695e-4dae-a121-ce52bf39689e","Type":"ContainerStarted","Data":"f1147b08bfb254277b7fdd90adde3cda9210007ee298aeaf438823e1bb894cde"} Oct 01 15:08:54 crc kubenswrapper[4771]: I1001 15:08:54.405934 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-s5glz" podStartSLOduration=3.405913683 podStartE2EDuration="3.405913683s" podCreationTimestamp="2025-10-01 15:08:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:08:54.40417439 +0000 UTC m=+779.023349571" watchObservedRunningTime="2025-10-01 15:08:54.405913683 +0000 UTC m=+779.025088854" Oct 01 15:08:55 crc kubenswrapper[4771]: I1001 15:08:55.394872 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-s5glz" Oct 01 15:08:58 crc kubenswrapper[4771]: I1001 15:08:58.438599 4771 generic.go:334] "Generic (PLEG): container finished" podID="36cbe93e-7162-4367-a756-e731b356fa91" containerID="d02f10f3e86e1ff43d4ae4c539e46f348edeceb19a1d1f9ffb026cca00b9beb7" exitCode=0 Oct 01 15:08:58 crc kubenswrapper[4771]: I1001 15:08:58.438796 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerDied","Data":"d02f10f3e86e1ff43d4ae4c539e46f348edeceb19a1d1f9ffb026cca00b9beb7"} Oct 01 15:08:58 crc kubenswrapper[4771]: I1001 15:08:58.442142 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" event={"ID":"d0200ff4-2245-408d-bf5f-28479e049c57","Type":"ContainerStarted","Data":"f63d628dfe249fc8647f3249bad4081a3e9d00dd742c2da2cfb97e5101095373"} Oct 01 15:08:58 crc kubenswrapper[4771]: I1001 15:08:58.442319 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:08:58 crc kubenswrapper[4771]: I1001 15:08:58.511840 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" podStartSLOduration=2.7684280489999997 podStartE2EDuration="8.511803134s" podCreationTimestamp="2025-10-01 15:08:50 +0000 UTC" firstStartedPulling="2025-10-01 15:08:52.339908688 +0000 UTC m=+776.959083889" lastFinishedPulling="2025-10-01 15:08:58.083283763 +0000 UTC m=+782.702458974" observedRunningTime="2025-10-01 15:08:58.499719975 
+0000 UTC m=+783.118895166" watchObservedRunningTime="2025-10-01 15:08:58.511803134 +0000 UTC m=+783.130978375" Oct 01 15:08:59 crc kubenswrapper[4771]: I1001 15:08:59.448240 4771 generic.go:334] "Generic (PLEG): container finished" podID="36cbe93e-7162-4367-a756-e731b356fa91" containerID="36f801979e6e0442f4f98b1e33a11a7a181ea4c213d157985709c3fbd21a8be8" exitCode=0 Oct 01 15:08:59 crc kubenswrapper[4771]: I1001 15:08:59.448289 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerDied","Data":"36f801979e6e0442f4f98b1e33a11a7a181ea4c213d157985709c3fbd21a8be8"} Oct 01 15:09:00 crc kubenswrapper[4771]: I1001 15:09:00.457647 4771 generic.go:334] "Generic (PLEG): container finished" podID="36cbe93e-7162-4367-a756-e731b356fa91" containerID="0c6fe7cbfb9c76fe8ccd5637d9fd7fe426e6a3cc54b92a75bccb5233f74da5ef" exitCode=0 Oct 01 15:09:00 crc kubenswrapper[4771]: I1001 15:09:00.457879 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerDied","Data":"0c6fe7cbfb9c76fe8ccd5637d9fd7fe426e6a3cc54b92a75bccb5233f74da5ef"} Oct 01 15:09:01 crc kubenswrapper[4771]: I1001 15:09:01.501452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerStarted","Data":"60ba82dbcf9d0be6f890829343252018b81ec528f338edbe0181a6641fd54b3c"} Oct 01 15:09:01 crc kubenswrapper[4771]: I1001 15:09:01.501831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerStarted","Data":"da5d47d7455b1cc9bc70cb548669a9762a70200d439266697694b4402e4b754a"} Oct 01 15:09:01 crc kubenswrapper[4771]: I1001 15:09:01.501852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" 
event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerStarted","Data":"2addf045891ecca5260d6292cda08993e2d1a3194975f976aba7b8ae375a1345"} Oct 01 15:09:01 crc kubenswrapper[4771]: I1001 15:09:01.501869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerStarted","Data":"d7a5d32da77ea7456b34bfc95bd2a4b764c26a647d4c9b2c3cfefebeb595386c"} Oct 01 15:09:01 crc kubenswrapper[4771]: I1001 15:09:01.501886 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerStarted","Data":"14ae7ef6e5e28321ab0d69669715c58ab9d0bf5eb46a09f134e3ef565405824a"} Oct 01 15:09:02 crc kubenswrapper[4771]: I1001 15:09:02.513101 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-mp8gm" event={"ID":"36cbe93e-7162-4367-a756-e731b356fa91","Type":"ContainerStarted","Data":"24b6b77851d3c4ccf6bb5b9af64193a6648bed2252f8f8412363ce0fe0ecdc2c"} Oct 01 15:09:02 crc kubenswrapper[4771]: I1001 15:09:02.513540 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:09:02 crc kubenswrapper[4771]: I1001 15:09:02.544662 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-mp8gm" podStartSLOduration=5.808191526 podStartE2EDuration="12.544644402s" podCreationTimestamp="2025-10-01 15:08:50 +0000 UTC" firstStartedPulling="2025-10-01 15:08:51.355696978 +0000 UTC m=+775.974872149" lastFinishedPulling="2025-10-01 15:08:58.092149804 +0000 UTC m=+782.711325025" observedRunningTime="2025-10-01 15:09:02.540558551 +0000 UTC m=+787.159733762" watchObservedRunningTime="2025-10-01 15:09:02.544644402 +0000 UTC m=+787.163819583" Oct 01 15:09:06 crc kubenswrapper[4771]: I1001 15:09:06.236865 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:09:06 crc kubenswrapper[4771]: I1001 15:09:06.287963 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:09:11 crc kubenswrapper[4771]: I1001 15:09:11.240351 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-mp8gm" Oct 01 15:09:11 crc kubenswrapper[4771]: I1001 15:09:11.870725 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4lvs5" Oct 01 15:09:11 crc kubenswrapper[4771]: I1001 15:09:11.980541 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-ffb8z" Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.177328 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.177492 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.177570 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.178562 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"80215404bb4371102288dbe39becc7517e16d1990f65ff6bea20c4cb7f6f0681"} 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.178701 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://80215404bb4371102288dbe39becc7517e16d1990f65ff6bea20c4cb7f6f0681" gracePeriod=600 Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.583553 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="80215404bb4371102288dbe39becc7517e16d1990f65ff6bea20c4cb7f6f0681" exitCode=0 Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.583599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"80215404bb4371102288dbe39becc7517e16d1990f65ff6bea20c4cb7f6f0681"} Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.583648 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"a954616bb5027e4b658bf522da064c2dd70331be4152d83f2506f267347e29d3"} Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.583669 4771 scope.go:117] "RemoveContainer" containerID="19a5b763fc09a48e284061322b6ea6e90ab3ff0404cebd1078a70132290e4cb2" Oct 01 15:09:12 crc kubenswrapper[4771]: I1001 15:09:12.842601 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-s5glz" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.012856 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-s5mcm"] Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.014385 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s5mcm" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.026205 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.026328 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-2dc4w" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.026538 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.036085 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s5mcm"] Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.044027 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7455t\" (UniqueName: \"kubernetes.io/projected/c9eb3e22-ac14-44ea-ba42-b67d47f91941-kube-api-access-7455t\") pod \"openstack-operator-index-s5mcm\" (UID: \"c9eb3e22-ac14-44ea-ba42-b67d47f91941\") " pod="openstack-operators/openstack-operator-index-s5mcm" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.144864 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7455t\" (UniqueName: \"kubernetes.io/projected/c9eb3e22-ac14-44ea-ba42-b67d47f91941-kube-api-access-7455t\") pod \"openstack-operator-index-s5mcm\" (UID: \"c9eb3e22-ac14-44ea-ba42-b67d47f91941\") " pod="openstack-operators/openstack-operator-index-s5mcm" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.167213 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7455t\" 
(UniqueName: \"kubernetes.io/projected/c9eb3e22-ac14-44ea-ba42-b67d47f91941-kube-api-access-7455t\") pod \"openstack-operator-index-s5mcm\" (UID: \"c9eb3e22-ac14-44ea-ba42-b67d47f91941\") " pod="openstack-operators/openstack-operator-index-s5mcm" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.341963 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s5mcm" Oct 01 15:09:16 crc kubenswrapper[4771]: I1001 15:09:16.799152 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-s5mcm"] Oct 01 15:09:16 crc kubenswrapper[4771]: W1001 15:09:16.809167 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9eb3e22_ac14_44ea_ba42_b67d47f91941.slice/crio-f22da90d9a9cdc46545521157772edb9e74c0a3ec97d1594b082b877cfcd6711 WatchSource:0}: Error finding container f22da90d9a9cdc46545521157772edb9e74c0a3ec97d1594b082b877cfcd6711: Status 404 returned error can't find the container with id f22da90d9a9cdc46545521157772edb9e74c0a3ec97d1594b082b877cfcd6711 Oct 01 15:09:17 crc kubenswrapper[4771]: I1001 15:09:17.632601 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s5mcm" event={"ID":"c9eb3e22-ac14-44ea-ba42-b67d47f91941","Type":"ContainerStarted","Data":"f22da90d9a9cdc46545521157772edb9e74c0a3ec97d1594b082b877cfcd6711"} Oct 01 15:09:19 crc kubenswrapper[4771]: I1001 15:09:19.350451 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s5mcm"] Oct 01 15:09:19 crc kubenswrapper[4771]: I1001 15:09:19.652508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s5mcm" event={"ID":"c9eb3e22-ac14-44ea-ba42-b67d47f91941","Type":"ContainerStarted","Data":"fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b"} Oct 01 
15:09:19 crc kubenswrapper[4771]: I1001 15:09:19.652693 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-s5mcm" podUID="c9eb3e22-ac14-44ea-ba42-b67d47f91941" containerName="registry-server" containerID="cri-o://fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b" gracePeriod=2 Oct 01 15:09:19 crc kubenswrapper[4771]: I1001 15:09:19.680974 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-s5mcm" podStartSLOduration=2.195899298 podStartE2EDuration="4.680941379s" podCreationTimestamp="2025-10-01 15:09:15 +0000 UTC" firstStartedPulling="2025-10-01 15:09:16.812387554 +0000 UTC m=+801.431562755" lastFinishedPulling="2025-10-01 15:09:19.297429665 +0000 UTC m=+803.916604836" observedRunningTime="2025-10-01 15:09:19.676609471 +0000 UTC m=+804.295784682" watchObservedRunningTime="2025-10-01 15:09:19.680941379 +0000 UTC m=+804.300116590" Oct 01 15:09:19 crc kubenswrapper[4771]: I1001 15:09:19.977215 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-hl4lz"] Oct 01 15:09:19 crc kubenswrapper[4771]: I1001 15:09:19.978151 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.027828 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-hl4lz"] Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.105360 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-s5mcm" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.109936 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwvd8\" (UniqueName: \"kubernetes.io/projected/4a21d3b6-e1e3-493b-baf1-fcbb055fb859-kube-api-access-zwvd8\") pod \"openstack-operator-index-hl4lz\" (UID: \"4a21d3b6-e1e3-493b-baf1-fcbb055fb859\") " pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.211023 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7455t\" (UniqueName: \"kubernetes.io/projected/c9eb3e22-ac14-44ea-ba42-b67d47f91941-kube-api-access-7455t\") pod \"c9eb3e22-ac14-44ea-ba42-b67d47f91941\" (UID: \"c9eb3e22-ac14-44ea-ba42-b67d47f91941\") " Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.211445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwvd8\" (UniqueName: \"kubernetes.io/projected/4a21d3b6-e1e3-493b-baf1-fcbb055fb859-kube-api-access-zwvd8\") pod \"openstack-operator-index-hl4lz\" (UID: \"4a21d3b6-e1e3-493b-baf1-fcbb055fb859\") " pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.216531 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9eb3e22-ac14-44ea-ba42-b67d47f91941-kube-api-access-7455t" (OuterVolumeSpecName: "kube-api-access-7455t") pod "c9eb3e22-ac14-44ea-ba42-b67d47f91941" (UID: "c9eb3e22-ac14-44ea-ba42-b67d47f91941"). InnerVolumeSpecName "kube-api-access-7455t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.228357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwvd8\" (UniqueName: \"kubernetes.io/projected/4a21d3b6-e1e3-493b-baf1-fcbb055fb859-kube-api-access-zwvd8\") pod \"openstack-operator-index-hl4lz\" (UID: \"4a21d3b6-e1e3-493b-baf1-fcbb055fb859\") " pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.310783 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.312268 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7455t\" (UniqueName: \"kubernetes.io/projected/c9eb3e22-ac14-44ea-ba42-b67d47f91941-kube-api-access-7455t\") on node \"crc\" DevicePath \"\"" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.663046 4771 generic.go:334] "Generic (PLEG): container finished" podID="c9eb3e22-ac14-44ea-ba42-b67d47f91941" containerID="fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b" exitCode=0 Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.663124 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s5mcm" event={"ID":"c9eb3e22-ac14-44ea-ba42-b67d47f91941","Type":"ContainerDied","Data":"fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b"} Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.663183 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-s5mcm" event={"ID":"c9eb3e22-ac14-44ea-ba42-b67d47f91941","Type":"ContainerDied","Data":"f22da90d9a9cdc46545521157772edb9e74c0a3ec97d1594b082b877cfcd6711"} Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.663205 4771 scope.go:117] "RemoveContainer" 
containerID="fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.663141 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-s5mcm" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.689784 4771 scope.go:117] "RemoveContainer" containerID="fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b" Oct 01 15:09:20 crc kubenswrapper[4771]: E1001 15:09:20.690599 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b\": container with ID starting with fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b not found: ID does not exist" containerID="fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.690672 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b"} err="failed to get container status \"fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b\": rpc error: code = NotFound desc = could not find container \"fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b\": container with ID starting with fd2f58fcbe39d9b4531417631218101d8e935c9ba2adf49126df6a7c45c08e2b not found: ID does not exist" Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.708259 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-s5mcm"] Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.714889 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-s5mcm"] Oct 01 15:09:20 crc kubenswrapper[4771]: I1001 15:09:20.780779 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/openstack-operator-index-hl4lz"] Oct 01 15:09:20 crc kubenswrapper[4771]: W1001 15:09:20.788995 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a21d3b6_e1e3_493b_baf1_fcbb055fb859.slice/crio-3ac5746452659458e4c76e113788b6e300f045bc38c83479b6774af14885d5a2 WatchSource:0}: Error finding container 3ac5746452659458e4c76e113788b6e300f045bc38c83479b6774af14885d5a2: Status 404 returned error can't find the container with id 3ac5746452659458e4c76e113788b6e300f045bc38c83479b6774af14885d5a2 Oct 01 15:09:21 crc kubenswrapper[4771]: I1001 15:09:21.674713 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hl4lz" event={"ID":"4a21d3b6-e1e3-493b-baf1-fcbb055fb859","Type":"ContainerStarted","Data":"c59ea892efefb1e9b00df3c2234b72ea80148ca006b8e34aa9f9f7bdb74887c6"} Oct 01 15:09:21 crc kubenswrapper[4771]: I1001 15:09:21.675225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-hl4lz" event={"ID":"4a21d3b6-e1e3-493b-baf1-fcbb055fb859","Type":"ContainerStarted","Data":"3ac5746452659458e4c76e113788b6e300f045bc38c83479b6774af14885d5a2"} Oct 01 15:09:21 crc kubenswrapper[4771]: I1001 15:09:21.701554 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-hl4lz" podStartSLOduration=2.641646159 podStartE2EDuration="2.701525406s" podCreationTimestamp="2025-10-01 15:09:19 +0000 UTC" firstStartedPulling="2025-10-01 15:09:20.794358558 +0000 UTC m=+805.413533759" lastFinishedPulling="2025-10-01 15:09:20.854237805 +0000 UTC m=+805.473413006" observedRunningTime="2025-10-01 15:09:21.697856574 +0000 UTC m=+806.317031825" watchObservedRunningTime="2025-10-01 15:09:21.701525406 +0000 UTC m=+806.320700617" Oct 01 15:09:22 crc kubenswrapper[4771]: I1001 15:09:22.001900 4771 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="c9eb3e22-ac14-44ea-ba42-b67d47f91941" path="/var/lib/kubelet/pods/c9eb3e22-ac14-44ea-ba42-b67d47f91941/volumes" Oct 01 15:09:30 crc kubenswrapper[4771]: I1001 15:09:30.311694 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:30 crc kubenswrapper[4771]: I1001 15:09:30.312317 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:30 crc kubenswrapper[4771]: I1001 15:09:30.355985 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:30 crc kubenswrapper[4771]: I1001 15:09:30.785390 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-hl4lz" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.721674 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cxscn"] Oct 01 15:09:33 crc kubenswrapper[4771]: E1001 15:09:33.722612 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9eb3e22-ac14-44ea-ba42-b67d47f91941" containerName="registry-server" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.722645 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9eb3e22-ac14-44ea-ba42-b67d47f91941" containerName="registry-server" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.722976 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9eb3e22-ac14-44ea-ba42-b67d47f91941" containerName="registry-server" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.724903 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.740628 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2kj2\" (UniqueName: \"kubernetes.io/projected/a7f9464c-4fa4-4b33-91d0-01b69a670e25-kube-api-access-r2kj2\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.740814 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-catalog-content\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.740908 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-utilities\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.758298 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxscn"] Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.842497 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-catalog-content\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.842592 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-utilities\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.842640 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2kj2\" (UniqueName: \"kubernetes.io/projected/a7f9464c-4fa4-4b33-91d0-01b69a670e25-kube-api-access-r2kj2\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.843034 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-catalog-content\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.843316 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-utilities\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:33 crc kubenswrapper[4771]: I1001 15:09:33.871420 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2kj2\" (UniqueName: \"kubernetes.io/projected/a7f9464c-4fa4-4b33-91d0-01b69a670e25-kube-api-access-r2kj2\") pod \"redhat-marketplace-cxscn\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") " pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:34 crc kubenswrapper[4771]: I1001 15:09:34.052113 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxscn" Oct 01 15:09:34 crc kubenswrapper[4771]: I1001 15:09:34.541183 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxscn"] Oct 01 15:09:34 crc kubenswrapper[4771]: W1001 15:09:34.552974 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7f9464c_4fa4_4b33_91d0_01b69a670e25.slice/crio-13bf58aa1acb95bd312fe1fed1bb21171fb446fea42edaf0434590137e9e36e5 WatchSource:0}: Error finding container 13bf58aa1acb95bd312fe1fed1bb21171fb446fea42edaf0434590137e9e36e5: Status 404 returned error can't find the container with id 13bf58aa1acb95bd312fe1fed1bb21171fb446fea42edaf0434590137e9e36e5 Oct 01 15:09:34 crc kubenswrapper[4771]: I1001 15:09:34.796978 4771 generic.go:334] "Generic (PLEG): container finished" podID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerID="b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74" exitCode=0 Oct 01 15:09:34 crc kubenswrapper[4771]: I1001 15:09:34.797056 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxscn" event={"ID":"a7f9464c-4fa4-4b33-91d0-01b69a670e25","Type":"ContainerDied","Data":"b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74"} Oct 01 15:09:34 crc kubenswrapper[4771]: I1001 15:09:34.797264 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxscn" event={"ID":"a7f9464c-4fa4-4b33-91d0-01b69a670e25","Type":"ContainerStarted","Data":"13bf58aa1acb95bd312fe1fed1bb21171fb446fea42edaf0434590137e9e36e5"} Oct 01 15:09:35 crc kubenswrapper[4771]: I1001 15:09:35.809011 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxscn" 
event={"ID":"a7f9464c-4fa4-4b33-91d0-01b69a670e25","Type":"ContainerStarted","Data":"fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8"} Oct 01 15:09:36 crc kubenswrapper[4771]: I1001 15:09:36.818335 4771 generic.go:334] "Generic (PLEG): container finished" podID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerID="fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8" exitCode=0 Oct 01 15:09:36 crc kubenswrapper[4771]: I1001 15:09:36.818373 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxscn" event={"ID":"a7f9464c-4fa4-4b33-91d0-01b69a670e25","Type":"ContainerDied","Data":"fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8"} Oct 01 15:09:37 crc kubenswrapper[4771]: I1001 15:09:37.829308 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxscn" event={"ID":"a7f9464c-4fa4-4b33-91d0-01b69a670e25","Type":"ContainerStarted","Data":"320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16"} Oct 01 15:09:37 crc kubenswrapper[4771]: I1001 15:09:37.853452 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cxscn" podStartSLOduration=2.326730313 podStartE2EDuration="4.853428488s" podCreationTimestamp="2025-10-01 15:09:33 +0000 UTC" firstStartedPulling="2025-10-01 15:09:34.798513626 +0000 UTC m=+819.417688787" lastFinishedPulling="2025-10-01 15:09:37.325211761 +0000 UTC m=+821.944386962" observedRunningTime="2025-10-01 15:09:37.850490305 +0000 UTC m=+822.469665486" watchObservedRunningTime="2025-10-01 15:09:37.853428488 +0000 UTC m=+822.472603699" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.342309 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c"] Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.343411 4771 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.345867 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-kvrtf" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.359028 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c"] Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.506886 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-bundle\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.506953 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4dqk\" (UniqueName: \"kubernetes.io/projected/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-kube-api-access-p4dqk\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.507582 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-util\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 
15:09:38.608988 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-util\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.609058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-bundle\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.609089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4dqk\" (UniqueName: \"kubernetes.io/projected/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-kube-api-access-p4dqk\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.609602 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-util\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.609653 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-bundle\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c"
Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.634891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4dqk\" (UniqueName: \"kubernetes.io/projected/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-kube-api-access-p4dqk\") pod \"f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") " pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c"
Oct 01 15:09:38 crc kubenswrapper[4771]: I1001 15:09:38.659216 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c"
Oct 01 15:09:39 crc kubenswrapper[4771]: I1001 15:09:39.081778 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c"]
Oct 01 15:09:39 crc kubenswrapper[4771]: W1001 15:09:39.084918 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod115476f1_e753_4ab0_9c3f_b4a6ea4a6739.slice/crio-e03d538e3731116a566a97a8d728080fd3648f1cfe59e25e57b1e608a37e0cdf WatchSource:0}: Error finding container e03d538e3731116a566a97a8d728080fd3648f1cfe59e25e57b1e608a37e0cdf: Status 404 returned error can't find the container with id e03d538e3731116a566a97a8d728080fd3648f1cfe59e25e57b1e608a37e0cdf
Oct 01 15:09:39 crc kubenswrapper[4771]: I1001 15:09:39.847435 4771 generic.go:334] "Generic (PLEG): container finished" podID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerID="16b66d59d48736479ff4d41e5ed946516fa76f565306e389c1110b859092638a" exitCode=0
Oct 01 15:09:39 crc kubenswrapper[4771]: I1001 15:09:39.847515 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" event={"ID":"115476f1-e753-4ab0-9c3f-b4a6ea4a6739","Type":"ContainerDied","Data":"16b66d59d48736479ff4d41e5ed946516fa76f565306e389c1110b859092638a"}
Oct 01 15:09:39 crc kubenswrapper[4771]: I1001 15:09:39.847878 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" event={"ID":"115476f1-e753-4ab0-9c3f-b4a6ea4a6739","Type":"ContainerStarted","Data":"e03d538e3731116a566a97a8d728080fd3648f1cfe59e25e57b1e608a37e0cdf"}
Oct 01 15:09:40 crc kubenswrapper[4771]: I1001 15:09:40.859875 4771 generic.go:334] "Generic (PLEG): container finished" podID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerID="b64e95009f19847d9746f4c90b55f45720e0b3530d0964ab0874fd16b303a38d" exitCode=0
Oct 01 15:09:40 crc kubenswrapper[4771]: I1001 15:09:40.859962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" event={"ID":"115476f1-e753-4ab0-9c3f-b4a6ea4a6739","Type":"ContainerDied","Data":"b64e95009f19847d9746f4c90b55f45720e0b3530d0964ab0874fd16b303a38d"}
Oct 01 15:09:41 crc kubenswrapper[4771]: I1001 15:09:41.871861 4771 generic.go:334] "Generic (PLEG): container finished" podID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerID="18971714db0c56b53350c3fba78a6870ed91892bb858346b5683f13a833a0d73" exitCode=0
Oct 01 15:09:41 crc kubenswrapper[4771]: I1001 15:09:41.871943 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" event={"ID":"115476f1-e753-4ab0-9c3f-b4a6ea4a6739","Type":"ContainerDied","Data":"18971714db0c56b53350c3fba78a6870ed91892bb858346b5683f13a833a0d73"}
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.274617 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c"
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.299137 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4dqk\" (UniqueName: \"kubernetes.io/projected/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-kube-api-access-p4dqk\") pod \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") "
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.300215 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-util\") pod \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") "
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.300312 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-bundle\") pod \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\" (UID: \"115476f1-e753-4ab0-9c3f-b4a6ea4a6739\") "
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.302384 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-bundle" (OuterVolumeSpecName: "bundle") pod "115476f1-e753-4ab0-9c3f-b4a6ea4a6739" (UID: "115476f1-e753-4ab0-9c3f-b4a6ea4a6739"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.316117 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-kube-api-access-p4dqk" (OuterVolumeSpecName: "kube-api-access-p4dqk") pod "115476f1-e753-4ab0-9c3f-b4a6ea4a6739" (UID: "115476f1-e753-4ab0-9c3f-b4a6ea4a6739"). InnerVolumeSpecName "kube-api-access-p4dqk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.325460 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-util" (OuterVolumeSpecName: "util") pod "115476f1-e753-4ab0-9c3f-b4a6ea4a6739" (UID: "115476f1-e753-4ab0-9c3f-b4a6ea4a6739"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.403202 4771 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-util\") on node \"crc\" DevicePath \"\""
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.403278 4771 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.403297 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4dqk\" (UniqueName: \"kubernetes.io/projected/115476f1-e753-4ab0-9c3f-b4a6ea4a6739-kube-api-access-p4dqk\") on node \"crc\" DevicePath \"\""
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.892365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c" event={"ID":"115476f1-e753-4ab0-9c3f-b4a6ea4a6739","Type":"ContainerDied","Data":"e03d538e3731116a566a97a8d728080fd3648f1cfe59e25e57b1e608a37e0cdf"}
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.892448 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e03d538e3731116a566a97a8d728080fd3648f1cfe59e25e57b1e608a37e0cdf"
Oct 01 15:09:43 crc kubenswrapper[4771]: I1001 15:09:43.892526 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c"
Oct 01 15:09:44 crc kubenswrapper[4771]: I1001 15:09:44.052549 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cxscn"
Oct 01 15:09:44 crc kubenswrapper[4771]: I1001 15:09:44.052636 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cxscn"
Oct 01 15:09:44 crc kubenswrapper[4771]: I1001 15:09:44.129287 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cxscn"
Oct 01 15:09:44 crc kubenswrapper[4771]: I1001 15:09:44.980097 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cxscn"
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.098049 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxscn"]
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.098404 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cxscn" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerName="registry-server" containerID="cri-o://320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16" gracePeriod=2
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.615450 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxscn"
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.668391 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-catalog-content\") pod \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") "
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.668557 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2kj2\" (UniqueName: \"kubernetes.io/projected/a7f9464c-4fa4-4b33-91d0-01b69a670e25-kube-api-access-r2kj2\") pod \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") "
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.668616 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-utilities\") pod \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\" (UID: \"a7f9464c-4fa4-4b33-91d0-01b69a670e25\") "
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.670323 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-utilities" (OuterVolumeSpecName: "utilities") pod "a7f9464c-4fa4-4b33-91d0-01b69a670e25" (UID: "a7f9464c-4fa4-4b33-91d0-01b69a670e25"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.677623 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7f9464c-4fa4-4b33-91d0-01b69a670e25-kube-api-access-r2kj2" (OuterVolumeSpecName: "kube-api-access-r2kj2") pod "a7f9464c-4fa4-4b33-91d0-01b69a670e25" (UID: "a7f9464c-4fa4-4b33-91d0-01b69a670e25"). InnerVolumeSpecName "kube-api-access-r2kj2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.693762 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a7f9464c-4fa4-4b33-91d0-01b69a670e25" (UID: "a7f9464c-4fa4-4b33-91d0-01b69a670e25"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.770868 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.770924 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2kj2\" (UniqueName: \"kubernetes.io/projected/a7f9464c-4fa4-4b33-91d0-01b69a670e25-kube-api-access-r2kj2\") on node \"crc\" DevicePath \"\""
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.770946 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a7f9464c-4fa4-4b33-91d0-01b69a670e25-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.931072 4771 generic.go:334] "Generic (PLEG): container finished" podID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerID="320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16" exitCode=0
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.931137 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxscn" event={"ID":"a7f9464c-4fa4-4b33-91d0-01b69a670e25","Type":"ContainerDied","Data":"320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16"}
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.931178 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cxscn" event={"ID":"a7f9464c-4fa4-4b33-91d0-01b69a670e25","Type":"ContainerDied","Data":"13bf58aa1acb95bd312fe1fed1bb21171fb446fea42edaf0434590137e9e36e5"}
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.931207 4771 scope.go:117] "RemoveContainer" containerID="320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16"
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.931264 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cxscn"
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.961243 4771 scope.go:117] "RemoveContainer" containerID="fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8"
Oct 01 15:09:47 crc kubenswrapper[4771]: I1001 15:09:47.978631 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxscn"]
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:47.999938 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cxscn"]
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.019856 4771 scope.go:117] "RemoveContainer" containerID="b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.044945 4771 scope.go:117] "RemoveContainer" containerID="320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16"
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.049206 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16\": container with ID starting with 320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16 not found: ID does not exist" containerID="320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.049363 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16"} err="failed to get container status \"320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16\": rpc error: code = NotFound desc = could not find container \"320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16\": container with ID starting with 320db39d196f19c5bc91e624dbfd890feb9f06dd862ffbe29f8c132526c2af16 not found: ID does not exist"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.049422 4771 scope.go:117] "RemoveContainer" containerID="fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8"
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.050238 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8\": container with ID starting with fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8 not found: ID does not exist" containerID="fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.050316 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8"} err="failed to get container status \"fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8\": rpc error: code = NotFound desc = could not find container \"fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8\": container with ID starting with fef89bbca8ffa831ffe196997ad232502c0101c2c2bea0a5d85b6334192313a8 not found: ID does not exist"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.050359 4771 scope.go:117] "RemoveContainer" containerID="b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74"
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.050906 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74\": container with ID starting with b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74 not found: ID does not exist" containerID="b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.050961 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74"} err="failed to get container status \"b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74\": rpc error: code = NotFound desc = could not find container \"b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74\": container with ID starting with b8b6149bf97157a0b23615a75510d726965caf264f487377745001bbc3397a74 not found: ID does not exist"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.109551 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zvnl9"]
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.117028 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerName="extract"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.117067 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerName="extract"
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.117106 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerName="util"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.117117 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerName="util"
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.117140 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerName="extract-utilities"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.117151 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerName="extract-utilities"
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.117165 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerName="registry-server"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.117176 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerName="registry-server"
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.117201 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerName="extract-content"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.117212 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerName="extract-content"
Oct 01 15:09:48 crc kubenswrapper[4771]: E1001 15:09:48.117228 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerName="pull"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.117239 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerName="pull"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.117436 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" containerName="registry-server"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.117451 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="115476f1-e753-4ab0-9c3f-b4a6ea4a6739" containerName="extract"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.118610 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.121170 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zvnl9"]
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.176835 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-catalog-content\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.176919 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpc2k\" (UniqueName: \"kubernetes.io/projected/338fc0b2-8b82-4137-bddc-5852a29ced03-kube-api-access-lpc2k\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.177116 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-utilities\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.278313 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-utilities\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.278380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-catalog-content\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.278415 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpc2k\" (UniqueName: \"kubernetes.io/projected/338fc0b2-8b82-4137-bddc-5852a29ced03-kube-api-access-lpc2k\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.279092 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-utilities\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.279163 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-catalog-content\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.296947 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpc2k\" (UniqueName: \"kubernetes.io/projected/338fc0b2-8b82-4137-bddc-5852a29ced03-kube-api-access-lpc2k\") pod \"community-operators-zvnl9\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") " pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.461110 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:48 crc kubenswrapper[4771]: I1001 15:09:48.982207 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zvnl9"]
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.701980 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"]
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.703285 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.705153 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-flgnd"
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.730031 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"]
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.802898 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc2xp\" (UniqueName: \"kubernetes.io/projected/6285b448-a922-4015-96e8-3af02ca8a82d-kube-api-access-dc2xp\") pod \"openstack-operator-controller-operator-9fb4b654-b6wcr\" (UID: \"6285b448-a922-4015-96e8-3af02ca8a82d\") " pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.904520 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc2xp\" (UniqueName: \"kubernetes.io/projected/6285b448-a922-4015-96e8-3af02ca8a82d-kube-api-access-dc2xp\") pod \"openstack-operator-controller-operator-9fb4b654-b6wcr\" (UID: \"6285b448-a922-4015-96e8-3af02ca8a82d\") " pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.923762 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc2xp\" (UniqueName: \"kubernetes.io/projected/6285b448-a922-4015-96e8-3af02ca8a82d-kube-api-access-dc2xp\") pod \"openstack-operator-controller-operator-9fb4b654-b6wcr\" (UID: \"6285b448-a922-4015-96e8-3af02ca8a82d\") " pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.943874 4771 generic.go:334] "Generic (PLEG): container finished" podID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerID="24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5" exitCode=0
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.943938 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvnl9" event={"ID":"338fc0b2-8b82-4137-bddc-5852a29ced03","Type":"ContainerDied","Data":"24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5"}
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.943989 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvnl9" event={"ID":"338fc0b2-8b82-4137-bddc-5852a29ced03","Type":"ContainerStarted","Data":"fcef2ab4aee11ba4e6ac7a89b74de076049bc7c4400fa607eb184905f1569e57"}
Oct 01 15:09:49 crc kubenswrapper[4771]: I1001 15:09:49.993357 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7f9464c-4fa4-4b33-91d0-01b69a670e25" path="/var/lib/kubelet/pods/a7f9464c-4fa4-4b33-91d0-01b69a670e25/volumes"
Oct 01 15:09:50 crc kubenswrapper[4771]: I1001 15:09:50.019405 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"
Oct 01 15:09:50 crc kubenswrapper[4771]: W1001 15:09:50.464265 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6285b448_a922_4015_96e8_3af02ca8a82d.slice/crio-d5d076298e72867433ae6d9ec176d9b4b94e5053a32c34cc746d1078d002f7a1 WatchSource:0}: Error finding container d5d076298e72867433ae6d9ec176d9b4b94e5053a32c34cc746d1078d002f7a1: Status 404 returned error can't find the container with id d5d076298e72867433ae6d9ec176d9b4b94e5053a32c34cc746d1078d002f7a1
Oct 01 15:09:50 crc kubenswrapper[4771]: I1001 15:09:50.470712 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"]
Oct 01 15:09:50 crc kubenswrapper[4771]: I1001 15:09:50.954968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr" event={"ID":"6285b448-a922-4015-96e8-3af02ca8a82d","Type":"ContainerStarted","Data":"d5d076298e72867433ae6d9ec176d9b4b94e5053a32c34cc746d1078d002f7a1"}
Oct 01 15:09:51 crc kubenswrapper[4771]: I1001 15:09:51.963247 4771 generic.go:334] "Generic (PLEG): container finished" podID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerID="d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f" exitCode=0
Oct 01 15:09:51 crc kubenswrapper[4771]: I1001 15:09:51.963406 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvnl9" event={"ID":"338fc0b2-8b82-4137-bddc-5852a29ced03","Type":"ContainerDied","Data":"d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f"}
Oct 01 15:09:54 crc kubenswrapper[4771]: I1001 15:09:54.986708 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvnl9" event={"ID":"338fc0b2-8b82-4137-bddc-5852a29ced03","Type":"ContainerStarted","Data":"807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308"}
Oct 01 15:09:54 crc kubenswrapper[4771]: I1001 15:09:54.988843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr" event={"ID":"6285b448-a922-4015-96e8-3af02ca8a82d","Type":"ContainerStarted","Data":"896e07aa27f645499243a99b3b74657c4670f801bc106e51a5a9191be26e979c"}
Oct 01 15:09:55 crc kubenswrapper[4771]: I1001 15:09:55.010865 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zvnl9" podStartSLOduration=3.016207339 podStartE2EDuration="7.010847718s" podCreationTimestamp="2025-10-01 15:09:48 +0000 UTC" firstStartedPulling="2025-10-01 15:09:49.945679134 +0000 UTC m=+834.564854305" lastFinishedPulling="2025-10-01 15:09:53.940319483 +0000 UTC m=+838.559494684" observedRunningTime="2025-10-01 15:09:55.00610771 +0000 UTC m=+839.625282891" watchObservedRunningTime="2025-10-01 15:09:55.010847718 +0000 UTC m=+839.630022889"
Oct 01 15:09:57 crc kubenswrapper[4771]: I1001 15:09:57.001230 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr" event={"ID":"6285b448-a922-4015-96e8-3af02ca8a82d","Type":"ContainerStarted","Data":"0f02d07d9a9dfa3d930990504482bd3117f60f4009631ebac39371b17cca8061"}
Oct 01 15:09:57 crc kubenswrapper[4771]: I1001 15:09:57.001598 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"
Oct 01 15:09:57 crc kubenswrapper[4771]: I1001 15:09:57.040248 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr" podStartSLOduration=2.056907039 podStartE2EDuration="8.040230023s" podCreationTimestamp="2025-10-01 15:09:49 +0000 UTC" firstStartedPulling="2025-10-01 15:09:50.467806721 +0000 UTC m=+835.086981902" lastFinishedPulling="2025-10-01 15:09:56.451129715 +0000 UTC m=+841.070304886" observedRunningTime="2025-10-01 15:09:57.038702896 +0000 UTC m=+841.657878067" watchObservedRunningTime="2025-10-01 15:09:57.040230023 +0000 UTC m=+841.659405194"
Oct 01 15:09:58 crc kubenswrapper[4771]: I1001 15:09:58.461339 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:58 crc kubenswrapper[4771]: I1001 15:09:58.461820 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:58 crc kubenswrapper[4771]: I1001 15:09:58.521682 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:59 crc kubenswrapper[4771]: I1001 15:09:59.064888 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:09:59 crc kubenswrapper[4771]: I1001 15:09:59.691406 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zvnl9"]
Oct 01 15:10:00 crc kubenswrapper[4771]: I1001 15:10:00.022648 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-9fb4b654-b6wcr"
Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.028986 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zvnl9" podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerName="registry-server" containerID="cri-o://807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308" gracePeriod=2
Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.519906 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvnl9"
Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.590113 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-catalog-content\") pod \"338fc0b2-8b82-4137-bddc-5852a29ced03\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") "
Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.590433 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpc2k\" (UniqueName: \"kubernetes.io/projected/338fc0b2-8b82-4137-bddc-5852a29ced03-kube-api-access-lpc2k\") pod \"338fc0b2-8b82-4137-bddc-5852a29ced03\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") "
Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.590548 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-utilities\") pod \"338fc0b2-8b82-4137-bddc-5852a29ced03\" (UID: \"338fc0b2-8b82-4137-bddc-5852a29ced03\") "
Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.591538 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-utilities" (OuterVolumeSpecName: "utilities") pod "338fc0b2-8b82-4137-bddc-5852a29ced03" (UID: "338fc0b2-8b82-4137-bddc-5852a29ced03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.597667 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/338fc0b2-8b82-4137-bddc-5852a29ced03-kube-api-access-lpc2k" (OuterVolumeSpecName: "kube-api-access-lpc2k") pod "338fc0b2-8b82-4137-bddc-5852a29ced03" (UID: "338fc0b2-8b82-4137-bddc-5852a29ced03"). InnerVolumeSpecName "kube-api-access-lpc2k".
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.649509 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "338fc0b2-8b82-4137-bddc-5852a29ced03" (UID: "338fc0b2-8b82-4137-bddc-5852a29ced03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.691995 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.692125 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/338fc0b2-8b82-4137-bddc-5852a29ced03-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:01 crc kubenswrapper[4771]: I1001 15:10:01.692152 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpc2k\" (UniqueName: \"kubernetes.io/projected/338fc0b2-8b82-4137-bddc-5852a29ced03-kube-api-access-lpc2k\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.046281 4771 generic.go:334] "Generic (PLEG): container finished" podID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerID="807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308" exitCode=0 Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.046346 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zvnl9" event={"ID":"338fc0b2-8b82-4137-bddc-5852a29ced03","Type":"ContainerDied","Data":"807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308"} Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.046390 4771 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-zvnl9" event={"ID":"338fc0b2-8b82-4137-bddc-5852a29ced03","Type":"ContainerDied","Data":"fcef2ab4aee11ba4e6ac7a89b74de076049bc7c4400fa607eb184905f1569e57"} Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.046420 4771 scope.go:117] "RemoveContainer" containerID="807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.046603 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zvnl9" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.080350 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zvnl9"] Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.081082 4771 scope.go:117] "RemoveContainer" containerID="d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.084638 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zvnl9"] Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.104316 4771 scope.go:117] "RemoveContainer" containerID="24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.138335 4771 scope.go:117] "RemoveContainer" containerID="807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308" Oct 01 15:10:02 crc kubenswrapper[4771]: E1001 15:10:02.139265 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308\": container with ID starting with 807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308 not found: ID does not exist" containerID="807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 
15:10:02.139399 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308"} err="failed to get container status \"807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308\": rpc error: code = NotFound desc = could not find container \"807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308\": container with ID starting with 807a777dd793581e3df72e5bc2efe3fc239c37efda1ac3f4a866df92f470a308 not found: ID does not exist" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.139687 4771 scope.go:117] "RemoveContainer" containerID="d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f" Oct 01 15:10:02 crc kubenswrapper[4771]: E1001 15:10:02.140379 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f\": container with ID starting with d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f not found: ID does not exist" containerID="d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.140462 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f"} err="failed to get container status \"d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f\": rpc error: code = NotFound desc = could not find container \"d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f\": container with ID starting with d6e4ec98f48bf755127f1820bd95b397718bc3a63034b6b98f1d36f778f3253f not found: ID does not exist" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.140565 4771 scope.go:117] "RemoveContainer" containerID="24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5" Oct 01 15:10:02 crc 
kubenswrapper[4771]: E1001 15:10:02.141021 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5\": container with ID starting with 24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5 not found: ID does not exist" containerID="24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5" Oct 01 15:10:02 crc kubenswrapper[4771]: I1001 15:10:02.141092 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5"} err="failed to get container status \"24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5\": rpc error: code = NotFound desc = could not find container \"24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5\": container with ID starting with 24a59c92919e9f77806f521b9a514baad396ed755ab43474eb462dd138fa9bb5 not found: ID does not exist" Oct 01 15:10:04 crc kubenswrapper[4771]: I1001 15:10:03.999616 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" path="/var/lib/kubelet/pods/338fc0b2-8b82-4137-bddc-5852a29ced03/volumes" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.303170 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sn2wf"] Oct 01 15:10:09 crc kubenswrapper[4771]: E1001 15:10:09.305168 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerName="extract-utilities" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.305202 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerName="extract-utilities" Oct 01 15:10:09 crc kubenswrapper[4771]: E1001 15:10:09.305224 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerName="registry-server" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.305234 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerName="registry-server" Oct 01 15:10:09 crc kubenswrapper[4771]: E1001 15:10:09.305250 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerName="extract-content" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.305260 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerName="extract-content" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.305429 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="338fc0b2-8b82-4137-bddc-5852a29ced03" containerName="registry-server" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.306675 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.322041 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn2wf"] Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.402399 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-catalog-content\") pod \"certified-operators-sn2wf\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.402460 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-utilities\") pod \"certified-operators-sn2wf\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") 
" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.402507 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6r49\" (UniqueName: \"kubernetes.io/projected/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-kube-api-access-z6r49\") pod \"certified-operators-sn2wf\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.504452 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-catalog-content\") pod \"certified-operators-sn2wf\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.504559 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-utilities\") pod \"certified-operators-sn2wf\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.504619 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6r49\" (UniqueName: \"kubernetes.io/projected/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-kube-api-access-z6r49\") pod \"certified-operators-sn2wf\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.505677 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-catalog-content\") pod \"certified-operators-sn2wf\" (UID: 
\"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.506098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-utilities\") pod \"certified-operators-sn2wf\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.527235 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6r49\" (UniqueName: \"kubernetes.io/projected/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-kube-api-access-z6r49\") pod \"certified-operators-sn2wf\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:09 crc kubenswrapper[4771]: I1001 15:10:09.635978 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:10 crc kubenswrapper[4771]: I1001 15:10:10.107975 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sn2wf"] Oct 01 15:10:10 crc kubenswrapper[4771]: W1001 15:10:10.123914 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec48824b_8859_4bc4_956d_78e5bbcc6d2b.slice/crio-db1bfdd08b26bb2682dc263df918a390107a718c71acf03687d3b5f4ad3f1ea0 WatchSource:0}: Error finding container db1bfdd08b26bb2682dc263df918a390107a718c71acf03687d3b5f4ad3f1ea0: Status 404 returned error can't find the container with id db1bfdd08b26bb2682dc263df918a390107a718c71acf03687d3b5f4ad3f1ea0 Oct 01 15:10:11 crc kubenswrapper[4771]: I1001 15:10:11.122324 4771 generic.go:334] "Generic (PLEG): container finished" podID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" 
containerID="c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3" exitCode=0 Oct 01 15:10:11 crc kubenswrapper[4771]: I1001 15:10:11.122437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn2wf" event={"ID":"ec48824b-8859-4bc4-956d-78e5bbcc6d2b","Type":"ContainerDied","Data":"c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3"} Oct 01 15:10:11 crc kubenswrapper[4771]: I1001 15:10:11.122809 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn2wf" event={"ID":"ec48824b-8859-4bc4-956d-78e5bbcc6d2b","Type":"ContainerStarted","Data":"db1bfdd08b26bb2682dc263df918a390107a718c71acf03687d3b5f4ad3f1ea0"} Oct 01 15:10:13 crc kubenswrapper[4771]: I1001 15:10:13.141131 4771 generic.go:334] "Generic (PLEG): container finished" podID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerID="732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67" exitCode=0 Oct 01 15:10:13 crc kubenswrapper[4771]: I1001 15:10:13.141245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn2wf" event={"ID":"ec48824b-8859-4bc4-956d-78e5bbcc6d2b","Type":"ContainerDied","Data":"732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67"} Oct 01 15:10:14 crc kubenswrapper[4771]: I1001 15:10:14.152274 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn2wf" event={"ID":"ec48824b-8859-4bc4-956d-78e5bbcc6d2b","Type":"ContainerStarted","Data":"e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af"} Oct 01 15:10:14 crc kubenswrapper[4771]: I1001 15:10:14.183395 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sn2wf" podStartSLOduration=2.746455384 podStartE2EDuration="5.183361217s" podCreationTimestamp="2025-10-01 15:10:09 +0000 UTC" firstStartedPulling="2025-10-01 15:10:11.125078698 
+0000 UTC m=+855.744253899" lastFinishedPulling="2025-10-01 15:10:13.561984561 +0000 UTC m=+858.181159732" observedRunningTime="2025-10-01 15:10:14.180328432 +0000 UTC m=+858.799503633" watchObservedRunningTime="2025-10-01 15:10:14.183361217 +0000 UTC m=+858.802536438" Oct 01 15:10:19 crc kubenswrapper[4771]: I1001 15:10:19.636253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:19 crc kubenswrapper[4771]: I1001 15:10:19.636806 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:19 crc kubenswrapper[4771]: I1001 15:10:19.709576 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:20 crc kubenswrapper[4771]: I1001 15:10:20.272359 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:20 crc kubenswrapper[4771]: I1001 15:10:20.345815 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sn2wf"] Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.219492 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sn2wf" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerName="registry-server" containerID="cri-o://e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af" gracePeriod=2 Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.655687 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.795765 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6r49\" (UniqueName: \"kubernetes.io/projected/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-kube-api-access-z6r49\") pod \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.795841 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-catalog-content\") pod \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.795919 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-utilities\") pod \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\" (UID: \"ec48824b-8859-4bc4-956d-78e5bbcc6d2b\") " Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.797067 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-utilities" (OuterVolumeSpecName: "utilities") pod "ec48824b-8859-4bc4-956d-78e5bbcc6d2b" (UID: "ec48824b-8859-4bc4-956d-78e5bbcc6d2b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.797319 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.801363 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-kube-api-access-z6r49" (OuterVolumeSpecName: "kube-api-access-z6r49") pod "ec48824b-8859-4bc4-956d-78e5bbcc6d2b" (UID: "ec48824b-8859-4bc4-956d-78e5bbcc6d2b"). InnerVolumeSpecName "kube-api-access-z6r49". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:10:22 crc kubenswrapper[4771]: I1001 15:10:22.899429 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6r49\" (UniqueName: \"kubernetes.io/projected/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-kube-api-access-z6r49\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.126960 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec48824b-8859-4bc4-956d-78e5bbcc6d2b" (UID: "ec48824b-8859-4bc4-956d-78e5bbcc6d2b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.203203 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec48824b-8859-4bc4-956d-78e5bbcc6d2b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.227370 4771 generic.go:334] "Generic (PLEG): container finished" podID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerID="e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af" exitCode=0 Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.227417 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn2wf" event={"ID":"ec48824b-8859-4bc4-956d-78e5bbcc6d2b","Type":"ContainerDied","Data":"e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af"} Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.227452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sn2wf" event={"ID":"ec48824b-8859-4bc4-956d-78e5bbcc6d2b","Type":"ContainerDied","Data":"db1bfdd08b26bb2682dc263df918a390107a718c71acf03687d3b5f4ad3f1ea0"} Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.227470 4771 scope.go:117] "RemoveContainer" containerID="e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.227464 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sn2wf" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.242381 4771 scope.go:117] "RemoveContainer" containerID="732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.255081 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sn2wf"] Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.260287 4771 scope.go:117] "RemoveContainer" containerID="c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.262413 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sn2wf"] Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.280062 4771 scope.go:117] "RemoveContainer" containerID="e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af" Oct 01 15:10:23 crc kubenswrapper[4771]: E1001 15:10:23.280533 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af\": container with ID starting with e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af not found: ID does not exist" containerID="e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.280574 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af"} err="failed to get container status \"e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af\": rpc error: code = NotFound desc = could not find container \"e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af\": container with ID starting with e0f6be131e2a1450d416dd613662c2d373d42d7f40f5939eefa7cec3d73f30af not 
found: ID does not exist" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.280601 4771 scope.go:117] "RemoveContainer" containerID="732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67" Oct 01 15:10:23 crc kubenswrapper[4771]: E1001 15:10:23.284651 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67\": container with ID starting with 732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67 not found: ID does not exist" containerID="732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.284699 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67"} err="failed to get container status \"732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67\": rpc error: code = NotFound desc = could not find container \"732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67\": container with ID starting with 732a90ddb3d625279c8ec7fce5255f65b5676a1f52bf923c3bc0b68fc991cf67 not found: ID does not exist" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.284991 4771 scope.go:117] "RemoveContainer" containerID="c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3" Oct 01 15:10:23 crc kubenswrapper[4771]: E1001 15:10:23.285446 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3\": container with ID starting with c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3 not found: ID does not exist" containerID="c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.285531 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3"} err="failed to get container status \"c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3\": rpc error: code = NotFound desc = could not find container \"c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3\": container with ID starting with c1f7320758e7d5ca10a1d150541d3fb4c3282efca44a298674eb2392cdfe63c3 not found: ID does not exist" Oct 01 15:10:23 crc kubenswrapper[4771]: I1001 15:10:23.997419 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" path="/var/lib/kubelet/pods/ec48824b-8859-4bc4-956d-78e5bbcc6d2b/volumes" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.361713 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-l8q87"] Oct 01 15:10:25 crc kubenswrapper[4771]: E1001 15:10:25.362010 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerName="extract-utilities" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.362024 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerName="extract-utilities" Oct 01 15:10:25 crc kubenswrapper[4771]: E1001 15:10:25.362039 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerName="registry-server" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.362047 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerName="registry-server" Oct 01 15:10:25 crc kubenswrapper[4771]: E1001 15:10:25.362062 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerName="extract-content" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 
15:10:25.362071 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerName="extract-content" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.362199 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec48824b-8859-4bc4-956d-78e5bbcc6d2b" containerName="registry-server" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.363298 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.377426 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l8q87"] Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.536714 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sn4m\" (UniqueName: \"kubernetes.io/projected/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-kube-api-access-9sn4m\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.536875 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-catalog-content\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.536940 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-utilities\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 
15:10:25.637877 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sn4m\" (UniqueName: \"kubernetes.io/projected/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-kube-api-access-9sn4m\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.637942 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-catalog-content\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.637963 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-utilities\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.638423 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-utilities\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.638687 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-catalog-content\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.656236 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9sn4m\" (UniqueName: \"kubernetes.io/projected/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-kube-api-access-9sn4m\") pod \"redhat-operators-l8q87\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.684617 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:25 crc kubenswrapper[4771]: I1001 15:10:25.938544 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-l8q87"] Oct 01 15:10:26 crc kubenswrapper[4771]: I1001 15:10:26.244675 4771 generic.go:334] "Generic (PLEG): container finished" podID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerID="cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27" exitCode=0 Oct 01 15:10:26 crc kubenswrapper[4771]: I1001 15:10:26.244724 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8q87" event={"ID":"0745ade3-bbcb-4ac3-b60d-4956c63d1be6","Type":"ContainerDied","Data":"cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27"} Oct 01 15:10:26 crc kubenswrapper[4771]: I1001 15:10:26.245011 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8q87" event={"ID":"0745ade3-bbcb-4ac3-b60d-4956c63d1be6","Type":"ContainerStarted","Data":"092af0863e4e4e0201f3047a4c6d6c0895d690e842dabb4c85c9c8168330ecb2"} Oct 01 15:10:27 crc kubenswrapper[4771]: I1001 15:10:27.252056 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8q87" event={"ID":"0745ade3-bbcb-4ac3-b60d-4956c63d1be6","Type":"ContainerStarted","Data":"56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6"} Oct 01 15:10:28 crc kubenswrapper[4771]: I1001 15:10:28.262677 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerID="56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6" exitCode=0 Oct 01 15:10:28 crc kubenswrapper[4771]: I1001 15:10:28.262822 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8q87" event={"ID":"0745ade3-bbcb-4ac3-b60d-4956c63d1be6","Type":"ContainerDied","Data":"56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6"} Oct 01 15:10:29 crc kubenswrapper[4771]: I1001 15:10:29.273262 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8q87" event={"ID":"0745ade3-bbcb-4ac3-b60d-4956c63d1be6","Type":"ContainerStarted","Data":"e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d"} Oct 01 15:10:29 crc kubenswrapper[4771]: I1001 15:10:29.294129 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-l8q87" podStartSLOduration=1.794802311 podStartE2EDuration="4.294102401s" podCreationTimestamp="2025-10-01 15:10:25 +0000 UTC" firstStartedPulling="2025-10-01 15:10:26.246119017 +0000 UTC m=+870.865294188" lastFinishedPulling="2025-10-01 15:10:28.745419097 +0000 UTC m=+873.364594278" observedRunningTime="2025-10-01 15:10:29.289104779 +0000 UTC m=+873.908279990" watchObservedRunningTime="2025-10-01 15:10:29.294102401 +0000 UTC m=+873.913277612" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.664212 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.666437 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.669785 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-f2b9c" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.670723 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.672299 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.677616 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-z42px" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.683346 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.684815 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.684954 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.685424 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.693478 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-7tt7q" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.720852 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.722443 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.727785 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.728037 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-zn4lz" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.735156 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.750830 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.759758 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.760716 4771 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.767743 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.768548 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-h5h4b" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.769111 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.774653 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhgnc\" (UniqueName: \"kubernetes.io/projected/31b30f6f-3f4e-4c6b-9517-0eb866b2c68c-kube-api-access-qhgnc\") pod \"cinder-operator-controller-manager-644bddb6d8-gbllm\" (UID: \"31b30f6f-3f4e-4c6b-9517-0eb866b2c68c\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.774689 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrcmf\" (UniqueName: \"kubernetes.io/projected/41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6-kube-api-access-wrcmf\") pod \"barbican-operator-controller-manager-6ff8b75857-6vp9n\" (UID: \"41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.776596 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.777609 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.779783 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-z7q4q" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.805871 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.813946 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.832503 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.836500 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-p75jq" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.836830 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.876552 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhgnc\" (UniqueName: \"kubernetes.io/projected/31b30f6f-3f4e-4c6b-9517-0eb866b2c68c-kube-api-access-qhgnc\") pod \"cinder-operator-controller-manager-644bddb6d8-gbllm\" (UID: \"31b30f6f-3f4e-4c6b-9517-0eb866b2c68c\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.876599 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrcmf\" (UniqueName: 
\"kubernetes.io/projected/41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6-kube-api-access-wrcmf\") pod \"barbican-operator-controller-manager-6ff8b75857-6vp9n\" (UID: \"41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.876762 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9pc\" (UniqueName: \"kubernetes.io/projected/46077b26-3930-4245-86b4-2d836a165664-kube-api-access-bs9pc\") pod \"designate-operator-controller-manager-84f4f7b77b-7tpgh\" (UID: \"46077b26-3930-4245-86b4-2d836a165664\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.876832 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czp58\" (UniqueName: \"kubernetes.io/projected/43a1a358-9eba-46eb-90c5-a34e0fad09d6-kube-api-access-czp58\") pod \"heat-operator-controller-manager-5d889d78cf-55pmn\" (UID: \"43a1a358-9eba-46eb-90c5-a34e0fad09d6\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.877000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfk7l\" (UniqueName: \"kubernetes.io/projected/681d5bbd-36b0-497f-9d27-f8cc7473399a-kube-api-access-qfk7l\") pod \"horizon-operator-controller-manager-9f4696d94-mwl7z\" (UID: \"681d5bbd-36b0-497f-9d27-f8cc7473399a\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.877023 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5wmr\" (UniqueName: 
\"kubernetes.io/projected/60253a13-4845-4234-81f4-329e6f35a86e-kube-api-access-m5wmr\") pod \"glance-operator-controller-manager-84958c4d49-99nfn\" (UID: \"60253a13-4845-4234-81f4-329e6f35a86e\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.880803 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.882933 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.885623 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7mnhn" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.925098 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhgnc\" (UniqueName: \"kubernetes.io/projected/31b30f6f-3f4e-4c6b-9517-0eb866b2c68c-kube-api-access-qhgnc\") pod \"cinder-operator-controller-manager-644bddb6d8-gbllm\" (UID: \"31b30f6f-3f4e-4c6b-9517-0eb866b2c68c\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.925184 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.934434 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrcmf\" (UniqueName: \"kubernetes.io/projected/41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6-kube-api-access-wrcmf\") pod \"barbican-operator-controller-manager-6ff8b75857-6vp9n\" (UID: \"41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" Oct 
01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.954561 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.978025 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp9cm\" (UniqueName: \"kubernetes.io/projected/fcb3eff6-0c4e-4046-a829-fab3a5942d21-kube-api-access-dp9cm\") pod \"ironic-operator-controller-manager-5cd4858477-zcdfw\" (UID: \"fcb3eff6-0c4e-4046-a829-fab3a5942d21\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.978077 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46h76\" (UniqueName: \"kubernetes.io/projected/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-kube-api-access-46h76\") pod \"infra-operator-controller-manager-9d6c5db85-xkmp2\" (UID: \"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.978116 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9pc\" (UniqueName: \"kubernetes.io/projected/46077b26-3930-4245-86b4-2d836a165664-kube-api-access-bs9pc\") pod \"designate-operator-controller-manager-84f4f7b77b-7tpgh\" (UID: \"46077b26-3930-4245-86b4-2d836a165664\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.978141 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czp58\" (UniqueName: \"kubernetes.io/projected/43a1a358-9eba-46eb-90c5-a34e0fad09d6-kube-api-access-czp58\") pod \"heat-operator-controller-manager-5d889d78cf-55pmn\" (UID: \"43a1a358-9eba-46eb-90c5-a34e0fad09d6\") " 
pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.978168 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-cert\") pod \"infra-operator-controller-manager-9d6c5db85-xkmp2\" (UID: \"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.978196 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfk7l\" (UniqueName: \"kubernetes.io/projected/681d5bbd-36b0-497f-9d27-f8cc7473399a-kube-api-access-qfk7l\") pod \"horizon-operator-controller-manager-9f4696d94-mwl7z\" (UID: \"681d5bbd-36b0-497f-9d27-f8cc7473399a\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.978214 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5wmr\" (UniqueName: \"kubernetes.io/projected/60253a13-4845-4234-81f4-329e6f35a86e-kube-api-access-m5wmr\") pod \"glance-operator-controller-manager-84958c4d49-99nfn\" (UID: \"60253a13-4845-4234-81f4-329e6f35a86e\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.979437 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64"] Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.980655 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.982744 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-t55np" Oct 01 15:10:35 crc kubenswrapper[4771]: I1001 15:10:35.995058 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.003295 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5wmr\" (UniqueName: \"kubernetes.io/projected/60253a13-4845-4234-81f4-329e6f35a86e-kube-api-access-m5wmr\") pod \"glance-operator-controller-manager-84958c4d49-99nfn\" (UID: \"60253a13-4845-4234-81f4-329e6f35a86e\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.005352 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czp58\" (UniqueName: \"kubernetes.io/projected/43a1a358-9eba-46eb-90c5-a34e0fad09d6-kube-api-access-czp58\") pod \"heat-operator-controller-manager-5d889d78cf-55pmn\" (UID: \"43a1a358-9eba-46eb-90c5-a34e0fad09d6\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.008102 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.015351 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.018712 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9pc\" (UniqueName: \"kubernetes.io/projected/46077b26-3930-4245-86b4-2d836a165664-kube-api-access-bs9pc\") pod \"designate-operator-controller-manager-84f4f7b77b-7tpgh\" (UID: \"46077b26-3930-4245-86b4-2d836a165664\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.019710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfk7l\" (UniqueName: \"kubernetes.io/projected/681d5bbd-36b0-497f-9d27-f8cc7473399a-kube-api-access-qfk7l\") pod \"horizon-operator-controller-manager-9f4696d94-mwl7z\" (UID: \"681d5bbd-36b0-497f-9d27-f8cc7473399a\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.026206 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.033844 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.043011 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.083073 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.083412 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9hl2\" (UniqueName: \"kubernetes.io/projected/0c5bf036-417b-4f93-94a0-7c8ddc9028d7-kube-api-access-d9hl2\") pod \"keystone-operator-controller-manager-5bd55b4bff-wps64\" (UID: \"0c5bf036-417b-4f93-94a0-7c8ddc9028d7\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.083485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-cert\") pod \"infra-operator-controller-manager-9d6c5db85-xkmp2\" (UID: \"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.083529 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp9cm\" (UniqueName: \"kubernetes.io/projected/fcb3eff6-0c4e-4046-a829-fab3a5942d21-kube-api-access-dp9cm\") pod \"ironic-operator-controller-manager-5cd4858477-zcdfw\" (UID: \"fcb3eff6-0c4e-4046-a829-fab3a5942d21\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.083559 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46h76\" (UniqueName: \"kubernetes.io/projected/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-kube-api-access-46h76\") pod \"infra-operator-controller-manager-9d6c5db85-xkmp2\" (UID: \"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:36 crc kubenswrapper[4771]: E1001 15:10:36.083913 4771 secret.go:188] 
Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Oct 01 15:10:36 crc kubenswrapper[4771]: E1001 15:10:36.083957 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-cert podName:f7c18d5d-6ebb-4c31-a348-6ae7feebfafc nodeName:}" failed. No retries permitted until 2025-10-01 15:10:36.583940484 +0000 UTC m=+881.203115655 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-cert") pod "infra-operator-controller-manager-9d6c5db85-xkmp2" (UID: "f7c18d5d-6ebb-4c31-a348-6ae7feebfafc") : secret "infra-operator-webhook-server-cert" not found Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.097790 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.098870 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.100996 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.107081 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-dkbhb" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.113801 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp9cm\" (UniqueName: \"kubernetes.io/projected/fcb3eff6-0c4e-4046-a829-fab3a5942d21-kube-api-access-dp9cm\") pod \"ironic-operator-controller-manager-5cd4858477-zcdfw\" (UID: \"fcb3eff6-0c4e-4046-a829-fab3a5942d21\") " pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.114606 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46h76\" (UniqueName: \"kubernetes.io/projected/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-kube-api-access-46h76\") pod \"infra-operator-controller-manager-9d6c5db85-xkmp2\" (UID: \"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.134176 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.145905 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.147528 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.156472 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-k5scb" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.179019 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.183829 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.184637 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9hl2\" (UniqueName: \"kubernetes.io/projected/0c5bf036-417b-4f93-94a0-7c8ddc9028d7-kube-api-access-d9hl2\") pod \"keystone-operator-controller-manager-5bd55b4bff-wps64\" (UID: \"0c5bf036-417b-4f93-94a0-7c8ddc9028d7\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.184897 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.187626 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.188810 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.193367 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-dq6mz" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.193668 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-fpkqn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.196680 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.200948 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.207027 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.208126 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.210390 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-z7w9v" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.211246 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9hl2\" (UniqueName: \"kubernetes.io/projected/0c5bf036-417b-4f93-94a0-7c8ddc9028d7-kube-api-access-d9hl2\") pod \"keystone-operator-controller-manager-5bd55b4bff-wps64\" (UID: \"0c5bf036-417b-4f93-94a0-7c8ddc9028d7\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.218424 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.235520 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.238492 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.240814 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vxhj6" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.250381 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.258160 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.259221 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.262169 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-s6pvx" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.262235 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.273481 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.285584 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv5wt\" (UniqueName: \"kubernetes.io/projected/863ec596-646c-41a0-b3e4-e33ad84c79aa-kube-api-access-cv5wt\") pod \"nova-operator-controller-manager-64cd67b5cb-l85hk\" (UID: \"863ec596-646c-41a0-b3e4-e33ad84c79aa\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.285622 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6ngb\" (UniqueName: \"kubernetes.io/projected/c8125ce6-7c9a-45d3-b820-698fd30d3471-kube-api-access-s6ngb\") pod \"neutron-operator-controller-manager-849d5b9b84-nv6q5\" (UID: \"c8125ce6-7c9a-45d3-b820-698fd30d3471\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.285649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gqj8\" (UniqueName: \"kubernetes.io/projected/a1c30bd9-dd78-4d92-9423-16597bf7d758-kube-api-access-4gqj8\") pod \"mariadb-operator-controller-manager-88c7-5jmlk\" (UID: \"a1c30bd9-dd78-4d92-9423-16597bf7d758\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.285694 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ntcm\" (UniqueName: \"kubernetes.io/projected/4c10ae08-be13-4725-b9be-c55ce015f33e-kube-api-access-2ntcm\") pod \"manila-operator-controller-manager-6d68dbc695-nv495\" (UID: \"4c10ae08-be13-4725-b9be-c55ce015f33e\") " 
pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.302290 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.303465 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.312927 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hh9cn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.321219 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.333342 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.335886 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.339663 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-fhr88" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.352507 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.359590 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.382433 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.387126 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv5wt\" (UniqueName: \"kubernetes.io/projected/863ec596-646c-41a0-b3e4-e33ad84c79aa-kube-api-access-cv5wt\") pod \"nova-operator-controller-manager-64cd67b5cb-l85hk\" (UID: \"863ec596-646c-41a0-b3e4-e33ad84c79aa\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.387181 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6ngb\" (UniqueName: \"kubernetes.io/projected/c8125ce6-7c9a-45d3-b820-698fd30d3471-kube-api-access-s6ngb\") pod \"neutron-operator-controller-manager-849d5b9b84-nv6q5\" (UID: \"c8125ce6-7c9a-45d3-b820-698fd30d3471\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.387213 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gqj8\" (UniqueName: \"kubernetes.io/projected/a1c30bd9-dd78-4d92-9423-16597bf7d758-kube-api-access-4gqj8\") pod \"mariadb-operator-controller-manager-88c7-5jmlk\" (UID: \"a1c30bd9-dd78-4d92-9423-16597bf7d758\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.387244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c87cb84f-c539-4562-8492-b1106b6181f1-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c62g57\" (UID: 
\"c87cb84f-c539-4562-8492-b1106b6181f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.387280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4whbx\" (UniqueName: \"kubernetes.io/projected/c87cb84f-c539-4562-8492-b1106b6181f1-kube-api-access-4whbx\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c62g57\" (UID: \"c87cb84f-c539-4562-8492-b1106b6181f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.387301 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lgv8\" (UniqueName: \"kubernetes.io/projected/b90ad79b-e447-4a96-82a1-4ae8cb5b9959-kube-api-access-2lgv8\") pod \"ovn-operator-controller-manager-9976ff44c-qnlxq\" (UID: \"b90ad79b-e447-4a96-82a1-4ae8cb5b9959\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.387319 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ntcm\" (UniqueName: \"kubernetes.io/projected/4c10ae08-be13-4725-b9be-c55ce015f33e-kube-api-access-2ntcm\") pod \"manila-operator-controller-manager-6d68dbc695-nv495\" (UID: \"4c10ae08-be13-4725-b9be-c55ce015f33e\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.387348 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zxcp\" (UniqueName: \"kubernetes.io/projected/417e6338-ae16-4903-8381-5bb1c3a92c75-kube-api-access-7zxcp\") pod \"octavia-operator-controller-manager-7b787867f4-fx47m\" (UID: \"417e6338-ae16-4903-8381-5bb1c3a92c75\") " 
pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.388347 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.420821 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.421562 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.439312 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-vvkwz"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.450352 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-8vzxc" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.450908 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gqj8\" (UniqueName: \"kubernetes.io/projected/a1c30bd9-dd78-4d92-9423-16597bf7d758-kube-api-access-4gqj8\") pod \"mariadb-operator-controller-manager-88c7-5jmlk\" (UID: \"a1c30bd9-dd78-4d92-9423-16597bf7d758\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.451709 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.453321 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6ngb\" (UniqueName: \"kubernetes.io/projected/c8125ce6-7c9a-45d3-b820-698fd30d3471-kube-api-access-s6ngb\") pod \"neutron-operator-controller-manager-849d5b9b84-nv6q5\" (UID: \"c8125ce6-7c9a-45d3-b820-698fd30d3471\") " pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.453802 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-rh7mv" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.458710 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-vvkwz"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.464404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv5wt\" (UniqueName: \"kubernetes.io/projected/863ec596-646c-41a0-b3e4-e33ad84c79aa-kube-api-access-cv5wt\") pod \"nova-operator-controller-manager-64cd67b5cb-l85hk\" (UID: \"863ec596-646c-41a0-b3e4-e33ad84c79aa\") " pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.464881 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ntcm\" (UniqueName: \"kubernetes.io/projected/4c10ae08-be13-4725-b9be-c55ce015f33e-kube-api-access-2ntcm\") pod \"manila-operator-controller-manager-6d68dbc695-nv495\" (UID: \"4c10ae08-be13-4725-b9be-c55ce015f33e\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.467671 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.476016 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.477345 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.478945 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-blh84" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.480095 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.483556 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.492096 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcdnz\" (UniqueName: \"kubernetes.io/projected/193791c4-4d63-4f50-a743-439b664c16b7-kube-api-access-gcdnz\") pod \"swift-operator-controller-manager-84d6b4b759-wljgn\" (UID: \"193791c4-4d63-4f50-a743-439b664c16b7\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.492195 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c87cb84f-c539-4562-8492-b1106b6181f1-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c62g57\" (UID: \"c87cb84f-c539-4562-8492-b1106b6181f1\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.492291 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4whbx\" (UniqueName: \"kubernetes.io/projected/c87cb84f-c539-4562-8492-b1106b6181f1-kube-api-access-4whbx\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c62g57\" (UID: \"c87cb84f-c539-4562-8492-b1106b6181f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.492313 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lgv8\" (UniqueName: \"kubernetes.io/projected/b90ad79b-e447-4a96-82a1-4ae8cb5b9959-kube-api-access-2lgv8\") pod \"ovn-operator-controller-manager-9976ff44c-qnlxq\" (UID: \"b90ad79b-e447-4a96-82a1-4ae8cb5b9959\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.492353 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zxcp\" (UniqueName: \"kubernetes.io/projected/417e6338-ae16-4903-8381-5bb1c3a92c75-kube-api-access-7zxcp\") pod \"octavia-operator-controller-manager-7b787867f4-fx47m\" (UID: \"417e6338-ae16-4903-8381-5bb1c3a92c75\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.492437 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5w79\" (UniqueName: \"kubernetes.io/projected/9c1c4098-4bbf-4d54-a09d-44b29ef352c3-kube-api-access-g5w79\") pod \"placement-operator-controller-manager-589c58c6c-bkpfg\" (UID: \"9c1c4098-4bbf-4d54-a09d-44b29ef352c3\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" Oct 01 15:10:36 crc 
kubenswrapper[4771]: E1001 15:10:36.492569 4771 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 15:10:36 crc kubenswrapper[4771]: E1001 15:10:36.492609 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c87cb84f-c539-4562-8492-b1106b6181f1-cert podName:c87cb84f-c539-4562-8492-b1106b6181f1 nodeName:}" failed. No retries permitted until 2025-10-01 15:10:36.99259655 +0000 UTC m=+881.611771721 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c87cb84f-c539-4562-8492-b1106b6181f1-cert") pod "openstack-baremetal-operator-controller-manager-77b9676b8c62g57" (UID: "c87cb84f-c539-4562-8492-b1106b6181f1") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.509256 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.519433 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.521340 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.523230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4whbx\" (UniqueName: \"kubernetes.io/projected/c87cb84f-c539-4562-8492-b1106b6181f1-kube-api-access-4whbx\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c62g57\" (UID: \"c87cb84f-c539-4562-8492-b1106b6181f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.524155 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.524499 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-69n8d" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.526207 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.535135 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.537331 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lgv8\" (UniqueName: \"kubernetes.io/projected/b90ad79b-e447-4a96-82a1-4ae8cb5b9959-kube-api-access-2lgv8\") pod \"ovn-operator-controller-manager-9976ff44c-qnlxq\" (UID: \"b90ad79b-e447-4a96-82a1-4ae8cb5b9959\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.538294 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.539480 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.541278 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-6zspf" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.542989 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.555602 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zxcp\" (UniqueName: \"kubernetes.io/projected/417e6338-ae16-4903-8381-5bb1c3a92c75-kube-api-access-7zxcp\") pod \"octavia-operator-controller-manager-7b787867f4-fx47m\" (UID: \"417e6338-ae16-4903-8381-5bb1c3a92c75\") " pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.563495 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.586289 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l8q87"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.595120 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-cert\") pod \"openstack-operator-controller-manager-577574bf4d-p8zrk\" (UID: \"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95\") " pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.595164 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9vlq\" (UniqueName: \"kubernetes.io/projected/e6d65172-53ec-4aae-a508-b955072cdd2a-kube-api-access-r9vlq\") pod \"test-operator-controller-manager-85777745bb-vvkwz\" (UID: \"e6d65172-53ec-4aae-a508-b955072cdd2a\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.595187 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-cert\") pod \"infra-operator-controller-manager-9d6c5db85-xkmp2\" (UID: \"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.595208 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbmzh\" (UniqueName: \"kubernetes.io/projected/a0517b85-5f5f-4d87-92f8-901564af068c-kube-api-access-lbmzh\") pod \"watcher-operator-controller-manager-6b9957f54f-crn74\" (UID: 
\"a0517b85-5f5f-4d87-92f8-901564af068c\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.595228 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7bm\" (UniqueName: \"kubernetes.io/projected/9abd1d50-adca-4d9a-8c33-89c3242174a5-kube-api-access-qx7bm\") pod \"telemetry-operator-controller-manager-c495dbccb-25dzd\" (UID: \"9abd1d50-adca-4d9a-8c33-89c3242174a5\") " pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.595268 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td8hz\" (UniqueName: \"kubernetes.io/projected/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-kube-api-access-td8hz\") pod \"openstack-operator-controller-manager-577574bf4d-p8zrk\" (UID: \"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95\") " pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.595362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5w79\" (UniqueName: \"kubernetes.io/projected/9c1c4098-4bbf-4d54-a09d-44b29ef352c3-kube-api-access-g5w79\") pod \"placement-operator-controller-manager-589c58c6c-bkpfg\" (UID: \"9c1c4098-4bbf-4d54-a09d-44b29ef352c3\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.595486 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcdnz\" (UniqueName: \"kubernetes.io/projected/193791c4-4d63-4f50-a743-439b664c16b7-kube-api-access-gcdnz\") pod \"swift-operator-controller-manager-84d6b4b759-wljgn\" (UID: \"193791c4-4d63-4f50-a743-439b664c16b7\") " 
pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.601285 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f7c18d5d-6ebb-4c31-a348-6ae7feebfafc-cert\") pod \"infra-operator-controller-manager-9d6c5db85-xkmp2\" (UID: \"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc\") " pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.625386 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5w79\" (UniqueName: \"kubernetes.io/projected/9c1c4098-4bbf-4d54-a09d-44b29ef352c3-kube-api-access-g5w79\") pod \"placement-operator-controller-manager-589c58c6c-bkpfg\" (UID: \"9c1c4098-4bbf-4d54-a09d-44b29ef352c3\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.628872 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcdnz\" (UniqueName: \"kubernetes.io/projected/193791c4-4d63-4f50-a743-439b664c16b7-kube-api-access-gcdnz\") pod \"swift-operator-controller-manager-84d6b4b759-wljgn\" (UID: \"193791c4-4d63-4f50-a743-439b664c16b7\") " pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.645204 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.696838 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-td8hz\" (UniqueName: \"kubernetes.io/projected/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-kube-api-access-td8hz\") pod \"openstack-operator-controller-manager-577574bf4d-p8zrk\" (UID: \"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95\") " pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.696917 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75rdf\" (UniqueName: \"kubernetes.io/projected/b5efb91f-7e66-488b-ab6f-e52dbf63bc3c-kube-api-access-75rdf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7clp8\" (UID: \"b5efb91f-7e66-488b-ab6f-e52dbf63bc3c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.696957 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-cert\") pod \"openstack-operator-controller-manager-577574bf4d-p8zrk\" (UID: \"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95\") " pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.696986 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9vlq\" (UniqueName: \"kubernetes.io/projected/e6d65172-53ec-4aae-a508-b955072cdd2a-kube-api-access-r9vlq\") pod \"test-operator-controller-manager-85777745bb-vvkwz\" (UID: \"e6d65172-53ec-4aae-a508-b955072cdd2a\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.697008 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbmzh\" (UniqueName: \"kubernetes.io/projected/a0517b85-5f5f-4d87-92f8-901564af068c-kube-api-access-lbmzh\") pod \"watcher-operator-controller-manager-6b9957f54f-crn74\" (UID: \"a0517b85-5f5f-4d87-92f8-901564af068c\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.697028 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx7bm\" (UniqueName: \"kubernetes.io/projected/9abd1d50-adca-4d9a-8c33-89c3242174a5-kube-api-access-qx7bm\") pod \"telemetry-operator-controller-manager-c495dbccb-25dzd\" (UID: \"9abd1d50-adca-4d9a-8c33-89c3242174a5\") " pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" Oct 01 15:10:36 crc kubenswrapper[4771]: E1001 15:10:36.697395 4771 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Oct 01 15:10:36 crc kubenswrapper[4771]: E1001 15:10:36.697448 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-cert podName:31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95 nodeName:}" failed. No retries permitted until 2025-10-01 15:10:37.197431815 +0000 UTC m=+881.816606986 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-cert") pod "openstack-operator-controller-manager-577574bf4d-p8zrk" (UID: "31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95") : secret "webhook-server-cert" not found Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.717608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx7bm\" (UniqueName: \"kubernetes.io/projected/9abd1d50-adca-4d9a-8c33-89c3242174a5-kube-api-access-qx7bm\") pod \"telemetry-operator-controller-manager-c495dbccb-25dzd\" (UID: \"9abd1d50-adca-4d9a-8c33-89c3242174a5\") " pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.719906 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-td8hz\" (UniqueName: \"kubernetes.io/projected/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-kube-api-access-td8hz\") pod \"openstack-operator-controller-manager-577574bf4d-p8zrk\" (UID: \"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95\") " pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.723361 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9vlq\" (UniqueName: \"kubernetes.io/projected/e6d65172-53ec-4aae-a508-b955072cdd2a-kube-api-access-r9vlq\") pod \"test-operator-controller-manager-85777745bb-vvkwz\" (UID: \"e6d65172-53ec-4aae-a508-b955072cdd2a\") " pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.728501 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbmzh\" (UniqueName: \"kubernetes.io/projected/a0517b85-5f5f-4d87-92f8-901564af068c-kube-api-access-lbmzh\") pod \"watcher-operator-controller-manager-6b9957f54f-crn74\" (UID: 
\"a0517b85-5f5f-4d87-92f8-901564af068c\") " pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.728848 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.758768 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.798186 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75rdf\" (UniqueName: \"kubernetes.io/projected/b5efb91f-7e66-488b-ab6f-e52dbf63bc3c-kube-api-access-75rdf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7clp8\" (UID: \"b5efb91f-7e66-488b-ab6f-e52dbf63bc3c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.822677 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75rdf\" (UniqueName: \"kubernetes.io/projected/b5efb91f-7e66-488b-ab6f-e52dbf63bc3c-kube-api-access-75rdf\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-7clp8\" (UID: \"b5efb91f-7e66-488b-ab6f-e52dbf63bc3c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.841537 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.891709 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.915988 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.934198 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.937476 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.949676 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.963326 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.969886 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.978553 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh"] Oct 01 15:10:36 crc kubenswrapper[4771]: I1001 15:10:36.989785 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.014157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c87cb84f-c539-4562-8492-b1106b6181f1-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c62g57\" (UID: \"c87cb84f-c539-4562-8492-b1106b6181f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.023679 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c87cb84f-c539-4562-8492-b1106b6181f1-cert\") pod \"openstack-baremetal-operator-controller-manager-77b9676b8c62g57\" (UID: \"c87cb84f-c539-4562-8492-b1106b6181f1\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:37 crc kubenswrapper[4771]: W1001 15:10:37.040715 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60253a13_4845_4234_81f4_329e6f35a86e.slice/crio-4335212a466595a3332547cc5f548117392dfa7c494e879526f68d3f56c955e7 WatchSource:0}: Error finding container 4335212a466595a3332547cc5f548117392dfa7c494e879526f68d3f56c955e7: Status 404 returned error can't find the container with id 4335212a466595a3332547cc5f548117392dfa7c494e879526f68d3f56c955e7 Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.186158 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.214611 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.216688 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-cert\") pod \"openstack-operator-controller-manager-577574bf4d-p8zrk\" (UID: \"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95\") " pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.221456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95-cert\") pod \"openstack-operator-controller-manager-577574bf4d-p8zrk\" (UID: \"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95\") " pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:37 crc kubenswrapper[4771]: W1001 15:10:37.221477 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43a1a358_9eba_46eb_90c5_a34e0fad09d6.slice/crio-d66b0c834e798e57c13e779ecc0cc4dd24099910d319c0524b2077c1b20248de WatchSource:0}: Error finding container d66b0c834e798e57c13e779ecc0cc4dd24099910d319c0524b2077c1b20248de: Status 404 returned error can't find the container with id d66b0c834e798e57c13e779ecc0cc4dd24099910d319c0524b2077c1b20248de Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.322858 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.349481 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" event={"ID":"31b30f6f-3f4e-4c6b-9517-0eb866b2c68c","Type":"ContainerStarted","Data":"0ac0038f55061ab0e3887b86ef836acec9b4dc7a4e40189291e5999601a4843d"} Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.350690 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" event={"ID":"43a1a358-9eba-46eb-90c5-a34e0fad09d6","Type":"ContainerStarted","Data":"d66b0c834e798e57c13e779ecc0cc4dd24099910d319c0524b2077c1b20248de"} Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.351800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" event={"ID":"41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6","Type":"ContainerStarted","Data":"13e1903d5fb63c407724257b15189e91c3977f62996b372c25ebf545cc34c057"} Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.352884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" event={"ID":"60253a13-4845-4234-81f4-329e6f35a86e","Type":"ContainerStarted","Data":"4335212a466595a3332547cc5f548117392dfa7c494e879526f68d3f56c955e7"} Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.354420 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" event={"ID":"46077b26-3930-4245-86b4-2d836a165664","Type":"ContainerStarted","Data":"aed72bdf97cdbfc708419ab4eed31fed131ef0acb32f577f01d1348daef471d5"} Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.514088 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.522760 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.533832 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk"] Oct 01 15:10:37 crc kubenswrapper[4771]: W1001 15:10:37.535038 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb90ad79b_e447_4a96_82a1_4ae8cb5b9959.slice/crio-35d16d59c1b82488ea1588cd97e2fd8aaf0051bb28d30f4dcef2964df55d9402 WatchSource:0}: Error finding container 35d16d59c1b82488ea1588cd97e2fd8aaf0051bb28d30f4dcef2964df55d9402: Status 404 returned error can't find the container with id 35d16d59c1b82488ea1588cd97e2fd8aaf0051bb28d30f4dcef2964df55d9402 Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.540746 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.555474 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.619976 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.629967 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64"] Oct 01 15:10:37 crc kubenswrapper[4771]: W1001 15:10:37.631851 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb3eff6_0c4e_4046_a829_fab3a5942d21.slice/crio-448c2c47e06915ef1ec2b19c4839d259fcbabdf67db867c5f209777398e60932 WatchSource:0}: Error finding container 448c2c47e06915ef1ec2b19c4839d259fcbabdf67db867c5f209777398e60932: Status 404 returned error can't find the container with id 448c2c47e06915ef1ec2b19c4839d259fcbabdf67db867c5f209777398e60932 Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.646899 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg"] Oct 01 15:10:37 crc kubenswrapper[4771]: E1001 15:10:37.652117 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-s6ngb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-849d5b9b84-nv6q5_openstack-operators(c8125ce6-7c9a-45d3-b820-698fd30d3471): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.653551 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2"] Oct 01 15:10:37 crc kubenswrapper[4771]: E1001 15:10:37.655771 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{600 -3} {} 600m DecimalSI},memory: {{2147483648 0} {} 2Gi BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{536870912 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-46h76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infra-operator-controller-manager-9d6c5db85-xkmp2_openstack-operators(f7c18d5d-6ebb-4c31-a348-6ae7feebfafc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.658126 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.785534 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m"] Oct 01 15:10:37 crc kubenswrapper[4771]: E1001 15:10:37.843958 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7zxcp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-7b787867f4-fx47m_openstack-operators(417e6338-ae16-4903-8381-5bb1c3a92c75): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 15:10:37 crc kubenswrapper[4771]: E1001 15:10:37.869138 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" podUID="c8125ce6-7c9a-45d3-b820-698fd30d3471" Oct 01 15:10:37 crc kubenswrapper[4771]: E1001 15:10:37.869652 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" podUID="f7c18d5d-6ebb-4c31-a348-6ae7feebfafc" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.885805 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.892263 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.913801 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd"] Oct 01 15:10:37 crc kubenswrapper[4771]: W1001 15:10:37.926004 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0517b85_5f5f_4d87_92f8_901564af068c.slice/crio-9078eca2b60e6e5865330cc04ee365e2ce0040488a165c90ad77ed94aaf10ac8 WatchSource:0}: Error finding container 9078eca2b60e6e5865330cc04ee365e2ce0040488a165c90ad77ed94aaf10ac8: Status 404 returned error can't find the container with id 9078eca2b60e6e5865330cc04ee365e2ce0040488a165c90ad77ed94aaf10ac8 Oct 01 15:10:37 crc 
kubenswrapper[4771]: E1001 15:10:37.927598 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.212:5001/openstack-k8s-operators/telemetry-operator:83668e9715381571dbce0710b16e15709f11dc29,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qx7bm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-c495dbccb-25dzd_openstack-operators(9abd1d50-adca-4d9a-8c33-89c3242174a5): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 15:10:37 crc kubenswrapper[4771]: E1001 15:10:37.928379 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lbmzh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6b9957f54f-crn74_openstack-operators(a0517b85-5f5f-4d87-92f8-901564af068c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 15:10:37 crc kubenswrapper[4771]: W1001 15:10:37.943869 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5efb91f_7e66_488b_ab6f_e52dbf63bc3c.slice/crio-984f10efb1b3f79a0103fa8bdaa3eb0edeaad7319ae74eecf2949e811dacc97f WatchSource:0}: Error finding container 
984f10efb1b3f79a0103fa8bdaa3eb0edeaad7319ae74eecf2949e811dacc97f: Status 404 returned error can't find the container with id 984f10efb1b3f79a0103fa8bdaa3eb0edeaad7319ae74eecf2949e811dacc97f Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.952930 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.959071 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-85777745bb-vvkwz"] Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.970882 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57"] Oct 01 15:10:37 crc kubenswrapper[4771]: W1001 15:10:37.972129 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d65172_53ec_4aae_a508_b955072cdd2a.slice/crio-5d0d2fcd9f6cb5c25e95b7de531a678ec029079197a385d9b1413883d2f47bbd WatchSource:0}: Error finding container 5d0d2fcd9f6cb5c25e95b7de531a678ec029079197a385d9b1413883d2f47bbd: Status 404 returned error can't find the container with id 5d0d2fcd9f6cb5c25e95b7de531a678ec029079197a385d9b1413883d2f47bbd Oct 01 15:10:37 crc kubenswrapper[4771]: E1001 15:10:37.973917 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-75rdf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-5f97d8c699-7clp8_openstack-operators(b5efb91f-7e66-488b-ab6f-e52dbf63bc3c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 
15:10:37 crc kubenswrapper[4771]: E1001 15:10:37.976086 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" podUID="b5efb91f-7e66-488b-ab6f-e52dbf63bc3c" Oct 01 15:10:37 crc kubenswrapper[4771]: I1001 15:10:37.976421 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk"] Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.004980 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r9vlq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-85777745bb-vvkwz_openstack-operators(e6d65172-53ec-4aae-a508-b955072cdd2a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.018333 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_D
EFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_I
MAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-multipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,}
,EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMA
GE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podifi
ed-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_LIGHTSPEED_IMAGE_URL_DEFAULT,Value:quay.io/openstack-lightspeed/rag-content:os-docs-2024.2,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IM
AGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECI
SION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4whbx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-baremetal-operator-controller-manager-77b9676b8c62g57_openstack-operators(c87cb84f-c539-4562-8492-b1106b6181f1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.084346 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" podUID="417e6338-ae16-4903-8381-5bb1c3a92c75" Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.181422 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" podUID="c87cb84f-c539-4562-8492-b1106b6181f1" Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.185552 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" podUID="a0517b85-5f5f-4d87-92f8-901564af068c" Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.200015 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" podUID="9abd1d50-adca-4d9a-8c33-89c3242174a5" Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.254386 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" podUID="e6d65172-53ec-4aae-a508-b955072cdd2a" Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.366800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" event={"ID":"4c10ae08-be13-4725-b9be-c55ce015f33e","Type":"ContainerStarted","Data":"0cbba908cc406859a187c88c86911b4befdf30c29718154a85e48a29477857f4"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.368185 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" event={"ID":"681d5bbd-36b0-497f-9d27-f8cc7473399a","Type":"ContainerStarted","Data":"8deb5d7642d2b0bb18be913249f49b8eacc3e57485b904c090f14ea3e04d5ce9"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.377801 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" event={"ID":"a1c30bd9-dd78-4d92-9423-16597bf7d758","Type":"ContainerStarted","Data":"5eb2bfddddd1d5b1b4d56ad14d1ac1df73df2018491891573a85c9a3ecddb983"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.392840 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" event={"ID":"b90ad79b-e447-4a96-82a1-4ae8cb5b9959","Type":"ContainerStarted","Data":"35d16d59c1b82488ea1588cd97e2fd8aaf0051bb28d30f4dcef2964df55d9402"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.396654 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" event={"ID":"9abd1d50-adca-4d9a-8c33-89c3242174a5","Type":"ContainerStarted","Data":"ae6bfee127fbcc3657e10e7603c45348e58f580e404986be3d2779f9e9ee2611"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.396688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" event={"ID":"9abd1d50-adca-4d9a-8c33-89c3242174a5","Type":"ContainerStarted","Data":"96f0d44586261f7186617f7ddceec9171aba8fbdc71c1ec0aab6d9be67446e13"} Oct 01 15:10:38 crc 
kubenswrapper[4771]: E1001 15:10:38.397827 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.212:5001/openstack-k8s-operators/telemetry-operator:83668e9715381571dbce0710b16e15709f11dc29\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" podUID="9abd1d50-adca-4d9a-8c33-89c3242174a5" Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.398591 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" event={"ID":"c87cb84f-c539-4562-8492-b1106b6181f1","Type":"ContainerStarted","Data":"4e361d16e251e11866535b1c97790a72d0b8fe586c7b536561ee01543fece146"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.398618 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" event={"ID":"c87cb84f-c539-4562-8492-b1106b6181f1","Type":"ContainerStarted","Data":"03f185601492a6f26ae14cd9e87281e40186c8042f8b51981881db7b944fc40d"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.402097 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" event={"ID":"a0517b85-5f5f-4d87-92f8-901564af068c","Type":"ContainerStarted","Data":"fa6b2902d6a103fb8a51594e15827ce51cd2482c4c3e3d58e23c3667b0d83d89"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.402133 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" event={"ID":"a0517b85-5f5f-4d87-92f8-901564af068c","Type":"ContainerStarted","Data":"9078eca2b60e6e5865330cc04ee365e2ce0040488a165c90ad77ed94aaf10ac8"} Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.403472 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" podUID="a0517b85-5f5f-4d87-92f8-901564af068c" Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.403812 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" podUID="c87cb84f-c539-4562-8492-b1106b6181f1" Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.409718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" event={"ID":"e6d65172-53ec-4aae-a508-b955072cdd2a","Type":"ContainerStarted","Data":"f144bccacb029393b73216184401b67b10947ffdba8411bbc901a4bd4959f8fc"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.409769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" event={"ID":"e6d65172-53ec-4aae-a508-b955072cdd2a","Type":"ContainerStarted","Data":"5d0d2fcd9f6cb5c25e95b7de531a678ec029079197a385d9b1413883d2f47bbd"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.418246 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" event={"ID":"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95","Type":"ContainerStarted","Data":"e9f65bf47c392ebd0333c669f6035ead5f13f3388cbde07b5c3a999e67aa02d0"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.418287 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" event={"ID":"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95","Type":"ContainerStarted","Data":"9104acf5bd43baa93a3a2e574e14aca6f6503f6d46e16bdf54a591e66ebbde2f"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.420501 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" event={"ID":"c8125ce6-7c9a-45d3-b820-698fd30d3471","Type":"ContainerStarted","Data":"1ae1ed36b0d3530764faf5787cf99a679bfb64f19d02efda08bc8f1662e69b1a"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.420521 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" event={"ID":"c8125ce6-7c9a-45d3-b820-698fd30d3471","Type":"ContainerStarted","Data":"48d860ebe24dc107d00fcffc2d885e6c8b98595f64aef5939fee998eaf42c54b"} Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.431977 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" podUID="e6d65172-53ec-4aae-a508-b955072cdd2a" Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.432058 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" podUID="c8125ce6-7c9a-45d3-b820-698fd30d3471" Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.449881 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" event={"ID":"193791c4-4d63-4f50-a743-439b664c16b7","Type":"ContainerStarted","Data":"7cf57cd4194e8e69a722c60dde372e6158348391a1e0fa68788e031540895202"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.467064 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" event={"ID":"0c5bf036-417b-4f93-94a0-7c8ddc9028d7","Type":"ContainerStarted","Data":"adfd421f74a359e7f5007379eee35656634c9e1979ef6ded02a794ddacaf97f6"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.482786 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" event={"ID":"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc","Type":"ContainerStarted","Data":"8f64eba1e87aa5fdf073b7aaacbbc2e8bad4e56b127230c60caa27cc7443cd9c"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.482833 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" event={"ID":"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc","Type":"ContainerStarted","Data":"0dabba3e5d86cf4f42e4ee5da3df4d629fe70c12767e7244bdb37dffedd322ee"} Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.484415 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" podUID="f7c18d5d-6ebb-4c31-a348-6ae7feebfafc" Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.486160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" 
event={"ID":"b5efb91f-7e66-488b-ab6f-e52dbf63bc3c","Type":"ContainerStarted","Data":"984f10efb1b3f79a0103fa8bdaa3eb0edeaad7319ae74eecf2949e811dacc97f"} Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.487390 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" podUID="b5efb91f-7e66-488b-ab6f-e52dbf63bc3c" Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.498965 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" event={"ID":"9c1c4098-4bbf-4d54-a09d-44b29ef352c3","Type":"ContainerStarted","Data":"afd8821405d0c74d607f51e0976f83bc2e87d85d7f0951a01ae2d8884e7fec48"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.509889 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" event={"ID":"863ec596-646c-41a0-b3e4-e33ad84c79aa","Type":"ContainerStarted","Data":"126d09b6973d2bbd6371ae22b1b84c11356a97b8984f3dc4625c85c5bf42ef89"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.522386 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" event={"ID":"fcb3eff6-0c4e-4046-a829-fab3a5942d21","Type":"ContainerStarted","Data":"448c2c47e06915ef1ec2b19c4839d259fcbabdf67db867c5f209777398e60932"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.533101 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-l8q87" podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerName="registry-server" 
containerID="cri-o://e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d" gracePeriod=2 Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.533822 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" event={"ID":"417e6338-ae16-4903-8381-5bb1c3a92c75","Type":"ContainerStarted","Data":"9644becf3b32e6171b731b756326b7239630222f2462406ed7112b9dd20cdb3b"} Oct 01 15:10:38 crc kubenswrapper[4771]: I1001 15:10:38.533852 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" event={"ID":"417e6338-ae16-4903-8381-5bb1c3a92c75","Type":"ContainerStarted","Data":"ff285fb87c27c8e601565dcf099d899f50e6b58127c99d8bd08134477d5a41bc"} Oct 01 15:10:38 crc kubenswrapper[4771]: E1001 15:10:38.537146 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" podUID="417e6338-ae16-4903-8381-5bb1c3a92c75" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.211217 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.295109 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sn4m\" (UniqueName: \"kubernetes.io/projected/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-kube-api-access-9sn4m\") pod \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.295159 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-utilities\") pod \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.295249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-catalog-content\") pod \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\" (UID: \"0745ade3-bbcb-4ac3-b60d-4956c63d1be6\") " Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.296906 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-utilities" (OuterVolumeSpecName: "utilities") pod "0745ade3-bbcb-4ac3-b60d-4956c63d1be6" (UID: "0745ade3-bbcb-4ac3-b60d-4956c63d1be6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.307206 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-kube-api-access-9sn4m" (OuterVolumeSpecName: "kube-api-access-9sn4m") pod "0745ade3-bbcb-4ac3-b60d-4956c63d1be6" (UID: "0745ade3-bbcb-4ac3-b60d-4956c63d1be6"). InnerVolumeSpecName "kube-api-access-9sn4m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.396373 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sn4m\" (UniqueName: \"kubernetes.io/projected/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-kube-api-access-9sn4m\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.396403 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.403011 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0745ade3-bbcb-4ac3-b60d-4956c63d1be6" (UID: "0745ade3-bbcb-4ac3-b60d-4956c63d1be6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.497707 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0745ade3-bbcb-4ac3-b60d-4956c63d1be6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.547069 4771 generic.go:334] "Generic (PLEG): container finished" podID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerID="e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d" exitCode=0 Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.547133 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-l8q87" event={"ID":"0745ade3-bbcb-4ac3-b60d-4956c63d1be6","Type":"ContainerDied","Data":"e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d"} Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.547157 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-l8q87" event={"ID":"0745ade3-bbcb-4ac3-b60d-4956c63d1be6","Type":"ContainerDied","Data":"092af0863e4e4e0201f3047a4c6d6c0895d690e842dabb4c85c9c8168330ecb2"} Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.547174 4771 scope.go:117] "RemoveContainer" containerID="e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.547264 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-l8q87" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.552428 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" event={"ID":"31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95","Type":"ContainerStarted","Data":"d7b76392675d6329802a240a5c214634dff4b0588f29a0e1e44673eb3c8752f4"} Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.555202 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/infra-operator@sha256:3f96f0843934236c261db73dacb50fc12a288890562ee4ebdc9ec22360937cd3\\\"\"" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" podUID="f7c18d5d-6ebb-4c31-a348-6ae7feebfafc" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.555394 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f61fdfbfd12027ce6b4e7ad553ec0582f080de0cfb472de6dc04ad3078bb17e3\\\"\"" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" podUID="e6d65172-53ec-4aae-a508-b955072cdd2a" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.555451 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:acdeebaa51f962066f42f38b6c2d34a62fc6a24f58f9ee63d61b1e0cafbb29f8\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" podUID="c8125ce6-7c9a-45d3-b820-698fd30d3471" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.555469 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" podUID="b5efb91f-7e66-488b-ab6f-e52dbf63bc3c" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.555504 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:09c2f519ea218f6038b7be039b8e6ac33ee93b217b9be0d2d18a5e7f94faae06\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" podUID="a0517b85-5f5f-4d87-92f8-901564af068c" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.555533 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:e1328760310f3bbf4548b8b1268cd711087dd91212b92bb0be287cad1f1b6fe9\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" podUID="417e6338-ae16-4903-8381-5bb1c3a92c75" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.555576 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"38.102.83.212:5001/openstack-k8s-operators/telemetry-operator:83668e9715381571dbce0710b16e15709f11dc29\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" podUID="9abd1d50-adca-4d9a-8c33-89c3242174a5" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.556786 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:e7cfed051c1cf801e651fd4035070e38698039f284ac0b2a0332769fdbb4a9c8\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" podUID="c87cb84f-c539-4562-8492-b1106b6181f1" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.587292 4771 scope.go:117] "RemoveContainer" containerID="56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.692924 4771 scope.go:117] "RemoveContainer" containerID="cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.716181 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" podStartSLOduration=3.71616164 podStartE2EDuration="3.71616164s" podCreationTimestamp="2025-10-01 15:10:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:10:39.715993666 +0000 UTC m=+884.335168837" watchObservedRunningTime="2025-10-01 15:10:39.71616164 +0000 UTC m=+884.335336811" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.857974 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-l8q87"] Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.865105 4771 scope.go:117] "RemoveContainer" 
containerID="e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.865678 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d\": container with ID starting with e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d not found: ID does not exist" containerID="e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.865725 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d"} err="failed to get container status \"e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d\": rpc error: code = NotFound desc = could not find container \"e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d\": container with ID starting with e5a803c30a8b3747cc18b1a894edf31818a64c5a1352636aa2c1ec34e191734d not found: ID does not exist" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.865768 4771 scope.go:117] "RemoveContainer" containerID="56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6" Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.866285 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6\": container with ID starting with 56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6 not found: ID does not exist" containerID="56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.866308 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6"} err="failed to get container status \"56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6\": rpc error: code = NotFound desc = could not find container \"56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6\": container with ID starting with 56166530dd051f3f00ded628b3439ec262fbd277cab4079d03a7750d805828e6 not found: ID does not exist" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.866325 4771 scope.go:117] "RemoveContainer" containerID="cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.867384 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-l8q87"] Oct 01 15:10:39 crc kubenswrapper[4771]: E1001 15:10:39.879587 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27\": container with ID starting with cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27 not found: ID does not exist" containerID="cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.879627 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27"} err="failed to get container status \"cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27\": rpc error: code = NotFound desc = could not find container \"cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27\": container with ID starting with cd961ca7f77031fbeec19c0c99febcb868d0d13f5bb38c4171c717bd92ad5b27 not found: ID does not exist" Oct 01 15:10:39 crc kubenswrapper[4771]: I1001 15:10:39.998719 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" path="/var/lib/kubelet/pods/0745ade3-bbcb-4ac3-b60d-4956c63d1be6/volumes" Oct 01 15:10:40 crc kubenswrapper[4771]: I1001 15:10:40.563208 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:47 crc kubenswrapper[4771]: I1001 15:10:47.330950 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-577574bf4d-p8zrk" Oct 01 15:10:52 crc kubenswrapper[4771]: E1001 15:10:52.523811 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f" Oct 01 15:10:52 crc kubenswrapper[4771]: E1001 15:10:52.525026 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cv5wt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-64cd67b5cb-l85hk_openstack-operators(863ec596-646c-41a0-b3e4-e33ad84c79aa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:10:52 crc kubenswrapper[4771]: E1001 15:10:52.775054 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" podUID="863ec596-646c-41a0-b3e4-e33ad84c79aa" Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.696914 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" event={"ID":"41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6","Type":"ContainerStarted","Data":"1b815346fb52af3bd501e1cda51a3e93853a9b67d491cc691dd73e2d3d4f531d"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.702592 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" event={"ID":"fcb3eff6-0c4e-4046-a829-fab3a5942d21","Type":"ContainerStarted","Data":"1e829a11e007b0347ea8190a4d73ec5ed0397664670f184e50576144448e9f15"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.702622 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" event={"ID":"fcb3eff6-0c4e-4046-a829-fab3a5942d21","Type":"ContainerStarted","Data":"4ba133d899fd7d81ad175f054b800be5b73a892f5909f39519323014b30ed10e"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.702843 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.713861 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" event={"ID":"4c10ae08-be13-4725-b9be-c55ce015f33e","Type":"ContainerStarted","Data":"76039a295aa29cfb51ef5b4f85a6fca3911776ebc3dc39dc103d29a721b3650b"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.736063 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" podStartSLOduration=3.740096948 
podStartE2EDuration="18.736043345s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.649078215 +0000 UTC m=+882.268253376" lastFinishedPulling="2025-10-01 15:10:52.645024602 +0000 UTC m=+897.264199773" observedRunningTime="2025-10-01 15:10:53.734245751 +0000 UTC m=+898.353420912" watchObservedRunningTime="2025-10-01 15:10:53.736043345 +0000 UTC m=+898.355218516" Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.747165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" event={"ID":"46077b26-3930-4245-86b4-2d836a165664","Type":"ContainerStarted","Data":"45e3f8b08acae5c4459426ad2bc78d835c1833841bc4c40812aee30681814ee8"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.774490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" event={"ID":"681d5bbd-36b0-497f-9d27-f8cc7473399a","Type":"ContainerStarted","Data":"f7ae3a1bbbb24912ae298201594b3c70bcb13fc6b54c0e94d8b313eee503811d"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.810112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" event={"ID":"b90ad79b-e447-4a96-82a1-4ae8cb5b9959","Type":"ContainerStarted","Data":"9a96ce56a3d7c3a533db6fc6706fc316e58a2b873feb8bf44bcfbe7436731764"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.813589 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" event={"ID":"193791c4-4d63-4f50-a743-439b664c16b7","Type":"ContainerStarted","Data":"2d07ce64de225d02a868587495bd085bcbb54907a1ad5e6c2c886ed953597a90"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.822845 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" 
event={"ID":"31b30f6f-3f4e-4c6b-9517-0eb866b2c68c","Type":"ContainerStarted","Data":"681786b0762ff0a4fe7d6b43752010a6a50b77d330bb19465b6fd6eb11cccfaa"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.846112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" event={"ID":"43a1a358-9eba-46eb-90c5-a34e0fad09d6","Type":"ContainerStarted","Data":"cca9b0a10db03bde17199cc65620cdd77f8892a68a1c4313aa2d5fa73de355f7"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.846173 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" event={"ID":"43a1a358-9eba-46eb-90c5-a34e0fad09d6","Type":"ContainerStarted","Data":"f74807767a04022fa6a02945339f7b26566e924a24dc6eccd99fa5aed913c383"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.847157 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.859748 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" event={"ID":"a1c30bd9-dd78-4d92-9423-16597bf7d758","Type":"ContainerStarted","Data":"a0b5d3bf3440b1b337c3c8f87e7da1de22e10bb6a17997e9731e5d9e68a38506"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.863896 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" event={"ID":"863ec596-646c-41a0-b3e4-e33ad84c79aa","Type":"ContainerStarted","Data":"2a8fff0f6beac775847a18f64a5c256975b48e8b79dc40918638f84a2a015883"} Oct 01 15:10:53 crc kubenswrapper[4771]: E1001 15:10:53.865433 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" podUID="863ec596-646c-41a0-b3e4-e33ad84c79aa" Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.866474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" event={"ID":"60253a13-4845-4234-81f4-329e6f35a86e","Type":"ContainerStarted","Data":"d4ff445fcef7fb385687bdde6097ff0737f7c4995301edff59260371e63fd2fc"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.868041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" event={"ID":"9c1c4098-4bbf-4d54-a09d-44b29ef352c3","Type":"ContainerStarted","Data":"4899884cbe2504b7e0c5e6fa7d6581887622665f019d55b919f4bcce764d3b08"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.880690 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" podStartSLOduration=3.514696978 podStartE2EDuration="18.880677608s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.227887741 +0000 UTC m=+881.847062912" lastFinishedPulling="2025-10-01 15:10:52.593868371 +0000 UTC m=+897.213043542" observedRunningTime="2025-10-01 15:10:53.87835202 +0000 UTC m=+898.497527191" watchObservedRunningTime="2025-10-01 15:10:53.880677608 +0000 UTC m=+898.499852779" Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.889772 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" event={"ID":"0c5bf036-417b-4f93-94a0-7c8ddc9028d7","Type":"ContainerStarted","Data":"1751e12b61f2db2873d73616f3244561b29a5d75d34fea302dfded097711654d"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 
15:10:53.889821 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" event={"ID":"0c5bf036-417b-4f93-94a0-7c8ddc9028d7","Type":"ContainerStarted","Data":"f554132ec873d691878016059f2b9379bbd9865be70588958a93a45f8cd87adf"} Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.890445 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" Oct 01 15:10:53 crc kubenswrapper[4771]: I1001 15:10:53.932670 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" podStartSLOduration=4.008424228 podStartE2EDuration="18.932653768s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.648010479 +0000 UTC m=+882.267185640" lastFinishedPulling="2025-10-01 15:10:52.572239989 +0000 UTC m=+897.191415180" observedRunningTime="2025-10-01 15:10:53.930990776 +0000 UTC m=+898.550165947" watchObservedRunningTime="2025-10-01 15:10:53.932653768 +0000 UTC m=+898.551828939" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.905416 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" event={"ID":"31b30f6f-3f4e-4c6b-9517-0eb866b2c68c","Type":"ContainerStarted","Data":"0ceaa3d8e33b5087df74413cf15115cc59e01f59f2575b653c6a5507ebb1cfab"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.905498 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.910000 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" 
event={"ID":"60253a13-4845-4234-81f4-329e6f35a86e","Type":"ContainerStarted","Data":"326f0b3b9c8744235c7854878d5853770d2cb319798e6c42e4dd54b1e670e165"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.910529 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.919225 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" event={"ID":"a1c30bd9-dd78-4d92-9423-16597bf7d758","Type":"ContainerStarted","Data":"201cd78dfa472052eebd3f55091ed2edd80d8e5b8354bcea010b378ae4ff514f"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.919465 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.923592 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" podStartSLOduration=4.405838326 podStartE2EDuration="19.923572695s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.097794246 +0000 UTC m=+881.716969417" lastFinishedPulling="2025-10-01 15:10:52.615528615 +0000 UTC m=+897.234703786" observedRunningTime="2025-10-01 15:10:54.918006218 +0000 UTC m=+899.537181389" watchObservedRunningTime="2025-10-01 15:10:54.923572695 +0000 UTC m=+899.542747866" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.926098 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" event={"ID":"9c1c4098-4bbf-4d54-a09d-44b29ef352c3","Type":"ContainerStarted","Data":"436380a3e5f90c0280f6938591ddab2c9493eed793efb5d31776d077170007f9"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.926250 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.938548 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" podStartSLOduration=4.439899026 podStartE2EDuration="19.938531784s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.116854136 +0000 UTC m=+881.736029307" lastFinishedPulling="2025-10-01 15:10:52.615486904 +0000 UTC m=+897.234662065" observedRunningTime="2025-10-01 15:10:54.934189217 +0000 UTC m=+899.553364388" watchObservedRunningTime="2025-10-01 15:10:54.938531784 +0000 UTC m=+899.557706955" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.945120 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" event={"ID":"41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6","Type":"ContainerStarted","Data":"17d3712339689901eaea9fd56db538164dedfb9f9823b368f7299b50d22a683c"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.945249 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.949288 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" event={"ID":"4c10ae08-be13-4725-b9be-c55ce015f33e","Type":"ContainerStarted","Data":"627019218c498c906457924f7847868bcb7be23fdefc0107ee9f3fc706e0e403"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.949343 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.951539 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" event={"ID":"46077b26-3930-4245-86b4-2d836a165664","Type":"ContainerStarted","Data":"b3b958edcf1ecc4c1f2fcc5f06adbd2b6806a5bcd7adbd167f4a4789e2776bfa"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.952369 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.956109 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" podStartSLOduration=3.960697703 podStartE2EDuration="18.956096127s" podCreationTimestamp="2025-10-01 15:10:36 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.618521962 +0000 UTC m=+882.237697133" lastFinishedPulling="2025-10-01 15:10:52.613920386 +0000 UTC m=+897.233095557" observedRunningTime="2025-10-01 15:10:54.953279457 +0000 UTC m=+899.572454628" watchObservedRunningTime="2025-10-01 15:10:54.956096127 +0000 UTC m=+899.575271298" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.960637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" event={"ID":"193791c4-4d63-4f50-a743-439b664c16b7","Type":"ContainerStarted","Data":"15d74df3e932e3e2744cb6994d362d8fdb06ff7a0becbf79e5c83231fef41439"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.960786 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.964427 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" 
event={"ID":"681d5bbd-36b0-497f-9d27-f8cc7473399a","Type":"ContainerStarted","Data":"bb862c9c644b811033a66b7959587aec4447d0afb1d2ac851947c2f036b58294"} Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.968398 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.976227 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" podStartSLOduration=4.963510072 podStartE2EDuration="19.976208012s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.615151689 +0000 UTC m=+882.234326860" lastFinishedPulling="2025-10-01 15:10:52.627849629 +0000 UTC m=+897.247024800" observedRunningTime="2025-10-01 15:10:54.967426236 +0000 UTC m=+899.586601407" watchObservedRunningTime="2025-10-01 15:10:54.976208012 +0000 UTC m=+899.595383173" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.976754 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" event={"ID":"b90ad79b-e447-4a96-82a1-4ae8cb5b9959","Type":"ContainerStarted","Data":"9092f1120383a1f0aea9f4ff231432d200cc3e1f776ab19c3c812c049810d9e1"} Oct 01 15:10:54 crc kubenswrapper[4771]: E1001 15:10:54.980005 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:a517abc6427ab73fed93b0bd89a6eb52d0311fbfb0c00752f889baf8ffd5068f\\\"\"" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" podUID="863ec596-646c-41a0-b3e4-e33ad84c79aa" Oct 01 15:10:54 crc kubenswrapper[4771]: I1001 15:10:54.997102 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" podStartSLOduration=4.307201088 podStartE2EDuration="18.997081696s" podCreationTimestamp="2025-10-01 15:10:36 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.925981265 +0000 UTC m=+882.545156436" lastFinishedPulling="2025-10-01 15:10:52.615861873 +0000 UTC m=+897.235037044" observedRunningTime="2025-10-01 15:10:54.988828793 +0000 UTC m=+899.608003954" watchObservedRunningTime="2025-10-01 15:10:54.997081696 +0000 UTC m=+899.616256867" Oct 01 15:10:55 crc kubenswrapper[4771]: I1001 15:10:55.007383 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" podStartSLOduration=5.04422029 podStartE2EDuration="20.007363839s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.651579316 +0000 UTC m=+882.270754487" lastFinishedPulling="2025-10-01 15:10:52.614722865 +0000 UTC m=+897.233898036" observedRunningTime="2025-10-01 15:10:55.002464369 +0000 UTC m=+899.621639540" watchObservedRunningTime="2025-10-01 15:10:55.007363839 +0000 UTC m=+899.626539010" Oct 01 15:10:55 crc kubenswrapper[4771]: I1001 15:10:55.027496 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" podStartSLOduration=4.540292137 podStartE2EDuration="20.027476515s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.064877085 +0000 UTC m=+881.684052256" lastFinishedPulling="2025-10-01 15:10:52.552061452 +0000 UTC m=+897.171236634" observedRunningTime="2025-10-01 15:10:55.017598112 +0000 UTC m=+899.636773283" watchObservedRunningTime="2025-10-01 15:10:55.027476515 +0000 UTC m=+899.646651676" Oct 01 15:10:55 crc kubenswrapper[4771]: I1001 15:10:55.055128 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" podStartSLOduration=4.48718374 podStartE2EDuration="20.055106755s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.046981765 +0000 UTC m=+881.666156936" lastFinishedPulling="2025-10-01 15:10:52.61490478 +0000 UTC m=+897.234079951" observedRunningTime="2025-10-01 15:10:55.036721782 +0000 UTC m=+899.655896963" watchObservedRunningTime="2025-10-01 15:10:55.055106755 +0000 UTC m=+899.674281926" Oct 01 15:10:55 crc kubenswrapper[4771]: I1001 15:10:55.061399 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" podStartSLOduration=5.031172619 podStartE2EDuration="20.06138486s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.521963194 +0000 UTC m=+882.141138365" lastFinishedPulling="2025-10-01 15:10:52.552175435 +0000 UTC m=+897.171350606" observedRunningTime="2025-10-01 15:10:55.050966613 +0000 UTC m=+899.670141784" watchObservedRunningTime="2025-10-01 15:10:55.06138486 +0000 UTC m=+899.680560031" Oct 01 15:10:55 crc kubenswrapper[4771]: I1001 15:10:55.070766 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" podStartSLOduration=4.959438412 podStartE2EDuration="20.0707513s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.537177469 +0000 UTC m=+882.156352640" lastFinishedPulling="2025-10-01 15:10:52.648490357 +0000 UTC m=+897.267665528" observedRunningTime="2025-10-01 15:10:55.070444593 +0000 UTC m=+899.689619794" watchObservedRunningTime="2025-10-01 15:10:55.0707513 +0000 UTC m=+899.689926471" Oct 01 15:10:55 crc kubenswrapper[4771]: I1001 15:10:55.996316 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.016044 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" event={"ID":"c8125ce6-7c9a-45d3-b820-698fd30d3471","Type":"ContainerStarted","Data":"7078933e37a9a05d955df1a09c90b6b89c2ac9a7fce3bf9c9e1aa639074936ab"} Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.017457 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.017637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" event={"ID":"b5efb91f-7e66-488b-ab6f-e52dbf63bc3c","Type":"ContainerStarted","Data":"ae430c6322c9292cbac3d29796f7e23394cdb16a59137ec65c00ac2b5c7070c5"} Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.019447 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" event={"ID":"9abd1d50-adca-4d9a-8c33-89c3242174a5","Type":"ContainerStarted","Data":"b816e3202ab5a1bb52412803c9859e222e89abf1feb844601e8b6e6d79b8be61"} Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.020050 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.021215 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" event={"ID":"c87cb84f-c539-4562-8492-b1106b6181f1","Type":"ContainerStarted","Data":"92ba0284833e50476c6ca60a5f1b47a2524fbe29762e1a40cff6f275fd162af3"} Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.021443 4771 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.022922 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" event={"ID":"a0517b85-5f5f-4d87-92f8-901564af068c","Type":"ContainerStarted","Data":"d82ca787a587460f1c9af044300d62dc5279025a9e23a7ae99381fc5a06b0deb"} Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.023093 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.024917 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" event={"ID":"417e6338-ae16-4903-8381-5bb1c3a92c75","Type":"ContainerStarted","Data":"97c9dbfcd60a6d1e6b222ae5f5cdd0c5ae616cfa4b64592fa2cb2b7952648c90"} Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.025234 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.026511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" event={"ID":"e6d65172-53ec-4aae-a508-b955072cdd2a","Type":"ContainerStarted","Data":"22ffc84c71760dd61958e3c74f849652be57acc64b1473ac139b1d4748270cf6"} Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.026705 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.028167 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" 
event={"ID":"f7c18d5d-6ebb-4c31-a348-6ae7feebfafc","Type":"ContainerStarted","Data":"3ad02420e02580907ba7ca7a846ef0a6a96490880004038b116c2764f3c38378"} Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.028334 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.097969 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" podStartSLOduration=2.702359898 podStartE2EDuration="24.097947716s" podCreationTimestamp="2025-10-01 15:10:36 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.927460002 +0000 UTC m=+882.546635173" lastFinishedPulling="2025-10-01 15:10:59.32304782 +0000 UTC m=+903.942222991" observedRunningTime="2025-10-01 15:11:00.097062534 +0000 UTC m=+904.716237705" watchObservedRunningTime="2025-10-01 15:11:00.097947716 +0000 UTC m=+904.717122887" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.100439 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" podStartSLOduration=3.466447869 podStartE2EDuration="25.100427408s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.651990837 +0000 UTC m=+882.271166008" lastFinishedPulling="2025-10-01 15:10:59.285970376 +0000 UTC m=+903.905145547" observedRunningTime="2025-10-01 15:11:00.056128006 +0000 UTC m=+904.675303177" watchObservedRunningTime="2025-10-01 15:11:00.100427408 +0000 UTC m=+904.719602579" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.137797 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" podStartSLOduration=3.870670514 podStartE2EDuration="25.137781377s" 
podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:38.017838508 +0000 UTC m=+882.637013679" lastFinishedPulling="2025-10-01 15:10:59.284949371 +0000 UTC m=+903.904124542" observedRunningTime="2025-10-01 15:11:00.134701712 +0000 UTC m=+904.753876883" watchObservedRunningTime="2025-10-01 15:11:00.137781377 +0000 UTC m=+904.756956548" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.173283 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" podStartSLOduration=2.893047586 podStartE2EDuration="24.173266722s" podCreationTimestamp="2025-10-01 15:10:36 +0000 UTC" firstStartedPulling="2025-10-01 15:10:38.004844398 +0000 UTC m=+882.624019569" lastFinishedPulling="2025-10-01 15:10:59.285063534 +0000 UTC m=+903.904238705" observedRunningTime="2025-10-01 15:11:00.171401265 +0000 UTC m=+904.790576446" watchObservedRunningTime="2025-10-01 15:11:00.173266722 +0000 UTC m=+904.792441893" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.224262 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" podStartSLOduration=2.830795052 podStartE2EDuration="24.224246467s" podCreationTimestamp="2025-10-01 15:10:36 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.928031926 +0000 UTC m=+882.547207097" lastFinishedPulling="2025-10-01 15:10:59.321483341 +0000 UTC m=+903.940658512" observedRunningTime="2025-10-01 15:11:00.200573974 +0000 UTC m=+904.819749145" watchObservedRunningTime="2025-10-01 15:11:00.224246467 +0000 UTC m=+904.843421638" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.225936 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" podStartSLOduration=3.546807895 podStartE2EDuration="25.225929988s" podCreationTimestamp="2025-10-01 
15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.655603465 +0000 UTC m=+882.274778636" lastFinishedPulling="2025-10-01 15:10:59.334725557 +0000 UTC m=+903.953900729" observedRunningTime="2025-10-01 15:11:00.222785681 +0000 UTC m=+904.841960862" watchObservedRunningTime="2025-10-01 15:11:00.225929988 +0000 UTC m=+904.845105159" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.289392 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" podStartSLOduration=3.848018047 podStartE2EDuration="25.289375052s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.843837352 +0000 UTC m=+882.463012523" lastFinishedPulling="2025-10-01 15:10:59.285194357 +0000 UTC m=+903.904369528" observedRunningTime="2025-10-01 15:11:00.24548418 +0000 UTC m=+904.864659351" watchObservedRunningTime="2025-10-01 15:11:00.289375052 +0000 UTC m=+904.908550223" Oct 01 15:11:00 crc kubenswrapper[4771]: I1001 15:11:00.289565 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-7clp8" podStartSLOduration=2.943278824 podStartE2EDuration="24.289562146s" podCreationTimestamp="2025-10-01 15:10:36 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.973802074 +0000 UTC m=+882.592977245" lastFinishedPulling="2025-10-01 15:10:59.320085396 +0000 UTC m=+903.939260567" observedRunningTime="2025-10-01 15:11:00.285604919 +0000 UTC m=+904.904780090" watchObservedRunningTime="2025-10-01 15:11:00.289562146 +0000 UTC m=+904.908737317" Oct 01 15:11:05 crc kubenswrapper[4771]: I1001 15:11:05.998868 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-6vp9n" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.010696 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-gbllm" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.031413 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-7tpgh" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.051440 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-99nfn" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.086155 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-55pmn" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.106788 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-mwl7z" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.276483 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5cd4858477-zcdfw" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.363406 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-wps64" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.474568 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-5jmlk" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.511967 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-849d5b9b84-nv6q5" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.566139 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-qnlxq" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.647862 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-bkpfg" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.731708 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-nv495" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.765712 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-9d6c5db85-xkmp2" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.844501 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7b787867f4-fx47m" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.895132 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-84d6b4b759-wljgn" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.920873 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-c495dbccb-25dzd" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.937694 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-85777745bb-vvkwz" Oct 01 15:11:06 crc kubenswrapper[4771]: I1001 15:11:06.952956 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6b9957f54f-crn74" Oct 01 15:11:07 crc kubenswrapper[4771]: I1001 15:11:07.194475 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-77b9676b8c62g57" Oct 01 15:11:10 crc kubenswrapper[4771]: I1001 15:11:10.104785 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" event={"ID":"863ec596-646c-41a0-b3e4-e33ad84c79aa","Type":"ContainerStarted","Data":"54e99c60d4629dd0cadc67cb95f244faf6f012d08d380951400f07933907cdae"} Oct 01 15:11:10 crc kubenswrapper[4771]: I1001 15:11:10.105785 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" Oct 01 15:11:10 crc kubenswrapper[4771]: I1001 15:11:10.134086 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" podStartSLOduration=3.212248607 podStartE2EDuration="35.134062999s" podCreationTimestamp="2025-10-01 15:10:35 +0000 UTC" firstStartedPulling="2025-10-01 15:10:37.615162419 +0000 UTC m=+882.234337590" lastFinishedPulling="2025-10-01 15:11:09.536976811 +0000 UTC m=+914.156151982" observedRunningTime="2025-10-01 15:11:10.129180908 +0000 UTC m=+914.748356119" watchObservedRunningTime="2025-10-01 15:11:10.134062999 +0000 UTC m=+914.753238210" Oct 01 15:11:12 crc kubenswrapper[4771]: I1001 15:11:12.177336 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:11:12 crc kubenswrapper[4771]: I1001 15:11:12.177418 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 01 15:11:16 crc kubenswrapper[4771]: I1001 15:11:16.529836 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-64cd67b5cb-l85hk" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.166517 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rv87g"] Oct 01 15:11:33 crc kubenswrapper[4771]: E1001 15:11:33.167276 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerName="extract-content" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.167289 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerName="extract-content" Oct 01 15:11:33 crc kubenswrapper[4771]: E1001 15:11:33.167317 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerName="extract-utilities" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.167323 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerName="extract-utilities" Oct 01 15:11:33 crc kubenswrapper[4771]: E1001 15:11:33.167337 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerName="registry-server" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.167344 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerName="registry-server" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.167466 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0745ade3-bbcb-4ac3-b60d-4956c63d1be6" containerName="registry-server" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.168272 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.170907 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-bdwq8" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.171055 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.171160 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.171317 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.181880 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rv87g"] Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.231205 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xd9fd"] Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.232341 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.234211 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.251305 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xd9fd"] Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.293154 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdb9l\" (UniqueName: \"kubernetes.io/projected/ffa0e70f-f409-4edb-af80-45683a7ec8cb-kube-api-access-mdb9l\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.293213 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pbwq\" (UniqueName: \"kubernetes.io/projected/d2091b34-b70f-4185-b37a-99e3bc0db9e7-kube-api-access-4pbwq\") pod \"dnsmasq-dns-675f4bcbfc-rv87g\" (UID: \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.293244 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.293285 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-config\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.293358 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2091b34-b70f-4185-b37a-99e3bc0db9e7-config\") pod \"dnsmasq-dns-675f4bcbfc-rv87g\" (UID: \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.394368 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-config\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.394465 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2091b34-b70f-4185-b37a-99e3bc0db9e7-config\") pod \"dnsmasq-dns-675f4bcbfc-rv87g\" (UID: \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.394524 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdb9l\" (UniqueName: \"kubernetes.io/projected/ffa0e70f-f409-4edb-af80-45683a7ec8cb-kube-api-access-mdb9l\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.394549 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pbwq\" (UniqueName: \"kubernetes.io/projected/d2091b34-b70f-4185-b37a-99e3bc0db9e7-kube-api-access-4pbwq\") pod \"dnsmasq-dns-675f4bcbfc-rv87g\" (UID: \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:11:33 
crc kubenswrapper[4771]: I1001 15:11:33.394575 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.395298 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-config\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.395520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.395752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2091b34-b70f-4185-b37a-99e3bc0db9e7-config\") pod \"dnsmasq-dns-675f4bcbfc-rv87g\" (UID: \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.415695 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pbwq\" (UniqueName: \"kubernetes.io/projected/d2091b34-b70f-4185-b37a-99e3bc0db9e7-kube-api-access-4pbwq\") pod \"dnsmasq-dns-675f4bcbfc-rv87g\" (UID: \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\") " pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.415747 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-mdb9l\" (UniqueName: \"kubernetes.io/projected/ffa0e70f-f409-4edb-af80-45683a7ec8cb-kube-api-access-mdb9l\") pod \"dnsmasq-dns-78dd6ddcc-xd9fd\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.487563 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:11:33 crc kubenswrapper[4771]: I1001 15:11:33.546493 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:11:34 crc kubenswrapper[4771]: I1001 15:11:34.036835 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rv87g"] Oct 01 15:11:34 crc kubenswrapper[4771]: I1001 15:11:34.040494 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:11:34 crc kubenswrapper[4771]: I1001 15:11:34.116320 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xd9fd"] Oct 01 15:11:34 crc kubenswrapper[4771]: W1001 15:11:34.119971 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffa0e70f_f409_4edb_af80_45683a7ec8cb.slice/crio-de65681cc3f5f466adcbace8a9e4ebf156377eedafe3e3a935afa3cdfd64cca2 WatchSource:0}: Error finding container de65681cc3f5f466adcbace8a9e4ebf156377eedafe3e3a935afa3cdfd64cca2: Status 404 returned error can't find the container with id de65681cc3f5f466adcbace8a9e4ebf156377eedafe3e3a935afa3cdfd64cca2 Oct 01 15:11:34 crc kubenswrapper[4771]: I1001 15:11:34.326955 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" event={"ID":"ffa0e70f-f409-4edb-af80-45683a7ec8cb","Type":"ContainerStarted","Data":"de65681cc3f5f466adcbace8a9e4ebf156377eedafe3e3a935afa3cdfd64cca2"} Oct 01 15:11:34 crc 
kubenswrapper[4771]: I1001 15:11:34.327914 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" event={"ID":"d2091b34-b70f-4185-b37a-99e3bc0db9e7","Type":"ContainerStarted","Data":"b94a90db235b38137eb65eacece325fb4cb3fa9dadee4afe3f537aed7b35e84e"} Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.091960 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rv87g"] Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.117503 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nbmcf"] Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.119013 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.138632 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nbmcf"] Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.249867 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn4gm\" (UniqueName: \"kubernetes.io/projected/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-kube-api-access-qn4gm\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.250022 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.250095 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-config\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.351563 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn4gm\" (UniqueName: \"kubernetes.io/projected/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-kube-api-access-qn4gm\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.353920 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.354007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-config\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.355527 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-dns-svc\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.356687 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-config\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: 
\"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.414865 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn4gm\" (UniqueName: \"kubernetes.io/projected/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-kube-api-access-qn4gm\") pod \"dnsmasq-dns-666b6646f7-nbmcf\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.449381 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xd9fd"] Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.463453 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.464417 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gtppt"] Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.466081 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.475848 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gtppt"] Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.565101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-config\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.565182 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfbq2\" (UniqueName: \"kubernetes.io/projected/8860d5a6-498b-4dcd-9648-93aa7afe9d41-kube-api-access-qfbq2\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.565234 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.667050 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfbq2\" (UniqueName: \"kubernetes.io/projected/8860d5a6-498b-4dcd-9648-93aa7afe9d41-kube-api-access-qfbq2\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.670616 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.670818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-config\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.674568 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-config\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.682894 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.704770 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfbq2\" (UniqueName: \"kubernetes.io/projected/8860d5a6-498b-4dcd-9648-93aa7afe9d41-kube-api-access-qfbq2\") pod \"dnsmasq-dns-57d769cc4f-gtppt\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:36 crc kubenswrapper[4771]: I1001 15:11:36.794407 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.022407 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nbmcf"] Oct 01 15:11:37 crc kubenswrapper[4771]: W1001 15:11:37.023251 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod182a382f_1b8f_49c4_8386_e9aa14cc4ffc.slice/crio-fbebe17b53d58468161ee2cd5f74c468eb70c813d82505a849538357e5069523 WatchSource:0}: Error finding container fbebe17b53d58468161ee2cd5f74c468eb70c813d82505a849538357e5069523: Status 404 returned error can't find the container with id fbebe17b53d58468161ee2cd5f74c468eb70c813d82505a849538357e5069523 Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.261061 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gtppt"] Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.270261 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.271842 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.274012 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.274222 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.274328 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.274014 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.274567 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.274936 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-p24np" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.275915 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.278440 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.357997 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" event={"ID":"8860d5a6-498b-4dcd-9648-93aa7afe9d41","Type":"ContainerStarted","Data":"7bb78ef999d8ba08e4daaf2ac5c653b29570675bab09a7b94bdfd05f1c08f541"} Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.363103 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" 
event={"ID":"182a382f-1b8f-49c4-8386-e9aa14cc4ffc","Type":"ContainerStarted","Data":"fbebe17b53d58468161ee2cd5f74c468eb70c813d82505a849538357e5069523"} Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382126 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382177 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382209 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ef68dad-0f62-4a2d-aa86-23997c284df0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ef68dad-0f62-4a2d-aa86-23997c284df0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382401 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382517 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382580 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382868 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382914 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj5v4\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-kube-api-access-vj5v4\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382929 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.382966 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.484991 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ef68dad-0f62-4a2d-aa86-23997c284df0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ef68dad-0f62-4a2d-aa86-23997c284df0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485078 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-config-data\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485218 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485247 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485275 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj5v4\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-kube-api-access-vj5v4\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485293 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485321 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485349 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.485370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.486721 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.488343 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.489083 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.489252 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-config-data\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 
01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.489592 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.490459 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.494242 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ef68dad-0f62-4a2d-aa86-23997c284df0-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.495589 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.500249 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ef68dad-0f62-4a2d-aa86-23997c284df0-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.508003 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.512779 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj5v4\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-kube-api-access-vj5v4\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.524523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.576609 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.577710 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.583580 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.583654 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.583725 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.583804 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.583837 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7mknv" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.585949 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.586022 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.592042 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.597656 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.687812 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.687853 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.687883 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.687973 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.688003 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-tcsk9\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-kube-api-access-tcsk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.688022 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.688048 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.688158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.688193 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.688217 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.688264 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.789866 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.790265 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.790379 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.790405 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcsk9\" (UniqueName: 
\"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-kube-api-access-tcsk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.790425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.790488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.790592 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.790676 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.790984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.791319 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.791534 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.791600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.791608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.792101 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: 
I1001 15:11:37.792189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.792392 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.796426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.806430 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.816570 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.817805 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.821751 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.824615 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcsk9\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-kube-api-access-tcsk9\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.827159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:37 crc kubenswrapper[4771]: I1001 15:11:37.909064 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:11:38 crc kubenswrapper[4771]: I1001 15:11:38.087366 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:11:38 crc kubenswrapper[4771]: W1001 15:11:38.150877 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ef68dad_0f62_4a2d_aa86_23997c284df0.slice/crio-ba6d3137207476036e159eb1e79b492d458e6bd71f21bb58a90d1f0aafb6d4da WatchSource:0}: Error finding container ba6d3137207476036e159eb1e79b492d458e6bd71f21bb58a90d1f0aafb6d4da: Status 404 returned error can't find the container with id ba6d3137207476036e159eb1e79b492d458e6bd71f21bb58a90d1f0aafb6d4da Oct 01 15:11:38 crc kubenswrapper[4771]: I1001 15:11:38.385633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ef68dad-0f62-4a2d-aa86-23997c284df0","Type":"ContainerStarted","Data":"ba6d3137207476036e159eb1e79b492d458e6bd71f21bb58a90d1f0aafb6d4da"} Oct 01 15:11:38 crc kubenswrapper[4771]: I1001 15:11:38.514170 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:11:38 crc kubenswrapper[4771]: W1001 15:11:38.535996 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf11eb6e5_8306_4db5_af63_ef4d869f7e2c.slice/crio-990a821ffedc702f5b50a8e7d9be36c92d0f3e39dacf89d89368fbcb6026ccf2 WatchSource:0}: Error finding container 990a821ffedc702f5b50a8e7d9be36c92d0f3e39dacf89d89368fbcb6026ccf2: Status 404 returned error can't find the container with id 990a821ffedc702f5b50a8e7d9be36c92d0f3e39dacf89d89368fbcb6026ccf2 Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.254472 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.260572 4771 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.263498 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.278845 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.278992 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.279169 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.279235 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.279172 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-x8hln" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.279649 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.395881 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f11eb6e5-8306-4db5-af63-ef4d869f7e2c","Type":"ContainerStarted","Data":"990a821ffedc702f5b50a8e7d9be36c92d0f3e39dacf89d89368fbcb6026ccf2"} Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.427755 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 
15:11:39.427810 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5825\" (UniqueName: \"kubernetes.io/projected/e902a14c-a59a-4278-b560-33de2cb50d32-kube-api-access-p5825\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.427944 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e902a14c-a59a-4278-b560-33de2cb50d32-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.428032 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-secrets\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.428070 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.428104 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-kolla-config\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.428117 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-config-data-default\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.428200 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.428543 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529786 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529824 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529845 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5825\" (UniqueName: \"kubernetes.io/projected/e902a14c-a59a-4278-b560-33de2cb50d32-kube-api-access-p5825\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529881 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e902a14c-a59a-4278-b560-33de2cb50d32-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529909 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-secrets\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529929 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529955 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.529970 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-config-data-default\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.530034 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.530608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e902a14c-a59a-4278-b560-33de2cb50d32-config-data-generated\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.531068 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-config-data-default\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.531598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-kolla-config\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 
15:11:39.532339 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e902a14c-a59a-4278-b560-33de2cb50d32-operator-scripts\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.540984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-secrets\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.541943 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.543571 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e902a14c-a59a-4278-b560-33de2cb50d32-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.551784 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5825\" (UniqueName: \"kubernetes.io/projected/e902a14c-a59a-4278-b560-33de2cb50d32-kube-api-access-p5825\") pod \"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.572288 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"openstack-galera-0\" (UID: \"e902a14c-a59a-4278-b560-33de2cb50d32\") " pod="openstack/openstack-galera-0" Oct 01 15:11:39 crc kubenswrapper[4771]: I1001 15:11:39.602749 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.210172 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 01 15:11:40 crc kubenswrapper[4771]: W1001 15:11:40.230229 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode902a14c_a59a_4278_b560_33de2cb50d32.slice/crio-9b5ab227379932bdc65ffba8420f434364cfff3afb85ed18cbbfb416eed474d0 WatchSource:0}: Error finding container 9b5ab227379932bdc65ffba8420f434364cfff3afb85ed18cbbfb416eed474d0: Status 404 returned error can't find the container with id 9b5ab227379932bdc65ffba8420f434364cfff3afb85ed18cbbfb416eed474d0 Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.408765 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e902a14c-a59a-4278-b560-33de2cb50d32","Type":"ContainerStarted","Data":"9b5ab227379932bdc65ffba8420f434364cfff3afb85ed18cbbfb416eed474d0"} Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.566156 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.567473 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.576961 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.589595 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.590187 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.590822 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-r2997" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.590966 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655381 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655482 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw2gm\" (UniqueName: \"kubernetes.io/projected/37c38012-d257-4269-86fa-8cf3ef4de4cd-kube-api-access-qw2gm\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655503 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655545 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37c38012-d257-4269-86fa-8cf3ef4de4cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655636 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655713 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655768 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655791 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.655854 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757439 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757556 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757586 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw2gm\" (UniqueName: \"kubernetes.io/projected/37c38012-d257-4269-86fa-8cf3ef4de4cd-kube-api-access-qw2gm\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757614 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757641 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37c38012-d257-4269-86fa-8cf3ef4de4cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757669 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757760 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.757988 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-galera-tls-certs\") pod 
\"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.760025 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.760640 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.760901 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.761047 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/37c38012-d257-4269-86fa-8cf3ef4de4cd-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.767210 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " 
pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.767926 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.768956 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/37c38012-d257-4269-86fa-8cf3ef4de4cd-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.788056 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/37c38012-d257-4269-86fa-8cf3ef4de4cd-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.797606 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw2gm\" (UniqueName: \"kubernetes.io/projected/37c38012-d257-4269-86fa-8cf3ef4de4cd-kube-api-access-qw2gm\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.797675 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"37c38012-d257-4269-86fa-8cf3ef4de4cd\") " pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.853747 4771 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/memcached-0"] Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.854729 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.863794 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-q4ldc" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.868235 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.868407 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.869423 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.910471 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.966600 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9m5x\" (UniqueName: \"kubernetes.io/projected/107cd834-196a-4454-b70b-cbb3ab3631df-kube-api-access-f9m5x\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.966648 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107cd834-196a-4454-b70b-cbb3ab3631df-combined-ca-bundle\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.966700 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/107cd834-196a-4454-b70b-cbb3ab3631df-memcached-tls-certs\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.966812 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/107cd834-196a-4454-b70b-cbb3ab3631df-config-data\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:40 crc kubenswrapper[4771]: I1001 15:11:40.966871 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/107cd834-196a-4454-b70b-cbb3ab3631df-kolla-config\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.069085 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/107cd834-196a-4454-b70b-cbb3ab3631df-config-data\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.069176 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/107cd834-196a-4454-b70b-cbb3ab3631df-kolla-config\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.069201 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9m5x\" (UniqueName: \"kubernetes.io/projected/107cd834-196a-4454-b70b-cbb3ab3631df-kube-api-access-f9m5x\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " 
pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.069221 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107cd834-196a-4454-b70b-cbb3ab3631df-combined-ca-bundle\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.069273 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/107cd834-196a-4454-b70b-cbb3ab3631df-memcached-tls-certs\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.071075 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/107cd834-196a-4454-b70b-cbb3ab3631df-kolla-config\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.071507 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/107cd834-196a-4454-b70b-cbb3ab3631df-config-data\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.085388 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/107cd834-196a-4454-b70b-cbb3ab3631df-combined-ca-bundle\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.085788 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/107cd834-196a-4454-b70b-cbb3ab3631df-memcached-tls-certs\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.087908 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9m5x\" (UniqueName: \"kubernetes.io/projected/107cd834-196a-4454-b70b-cbb3ab3631df-kube-api-access-f9m5x\") pod \"memcached-0\" (UID: \"107cd834-196a-4454-b70b-cbb3ab3631df\") " pod="openstack/memcached-0" Oct 01 15:11:41 crc kubenswrapper[4771]: I1001 15:11:41.200819 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Oct 01 15:11:42 crc kubenswrapper[4771]: I1001 15:11:42.176860 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:11:42 crc kubenswrapper[4771]: I1001 15:11:42.176909 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:11:42 crc kubenswrapper[4771]: I1001 15:11:42.841195 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:11:42 crc kubenswrapper[4771]: I1001 15:11:42.842374 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 15:11:42 crc kubenswrapper[4771]: I1001 15:11:42.844902 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-gdj7g" Oct 01 15:11:42 crc kubenswrapper[4771]: I1001 15:11:42.852039 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:11:42 crc kubenswrapper[4771]: I1001 15:11:42.900855 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9z5w\" (UniqueName: \"kubernetes.io/projected/eca5bbfa-3927-4c5b-b973-7dce060db69b-kube-api-access-h9z5w\") pod \"kube-state-metrics-0\" (UID: \"eca5bbfa-3927-4c5b-b973-7dce060db69b\") " pod="openstack/kube-state-metrics-0" Oct 01 15:11:43 crc kubenswrapper[4771]: I1001 15:11:43.002710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9z5w\" (UniqueName: \"kubernetes.io/projected/eca5bbfa-3927-4c5b-b973-7dce060db69b-kube-api-access-h9z5w\") pod \"kube-state-metrics-0\" (UID: \"eca5bbfa-3927-4c5b-b973-7dce060db69b\") " pod="openstack/kube-state-metrics-0" Oct 01 15:11:43 crc kubenswrapper[4771]: I1001 15:11:43.029608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9z5w\" (UniqueName: \"kubernetes.io/projected/eca5bbfa-3927-4c5b-b973-7dce060db69b-kube-api-access-h9z5w\") pod \"kube-state-metrics-0\" (UID: \"eca5bbfa-3927-4c5b-b973-7dce060db69b\") " pod="openstack/kube-state-metrics-0" Oct 01 15:11:43 crc kubenswrapper[4771]: I1001 15:11:43.174060 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.511095 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zpdvh"] Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.523222 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zpdvh"] Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.565353 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.569211 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.569849 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-tsgbx" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.572156 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.616905 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-rv4hj"] Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.618765 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.625416 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rv4hj"] Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.668708 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-combined-ca-bundle\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.668785 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhwnl\" (UniqueName: \"kubernetes.io/projected/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-kube-api-access-bhwnl\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.668839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-ovn-controller-tls-certs\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.668896 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-run\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.669084 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-log-ovn\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.669135 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-scripts\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.669205 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-run-ovn\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.770694 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-scripts\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.770811 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74fee09d-11ad-45f4-a779-4c352b6dc67f-scripts\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.770844 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-run-ovn\") pod \"ovn-controller-zpdvh\" 
(UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.770866 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-log\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.770895 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-run\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.770938 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-combined-ca-bundle\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.770969 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhwnl\" (UniqueName: \"kubernetes.io/projected/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-kube-api-access-bhwnl\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.771005 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw9qx\" (UniqueName: \"kubernetes.io/projected/74fee09d-11ad-45f4-a779-4c352b6dc67f-kube-api-access-sw9qx\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " 
pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.771030 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-ovn-controller-tls-certs\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.771086 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-run\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.771110 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-etc-ovs\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.771139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-lib\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.771165 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-log-ovn\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.772938 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-run-ovn\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.772992 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-run\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.773980 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-var-log-ovn\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.774808 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-scripts\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.777583 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-combined-ca-bundle\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.791011 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-ovn-controller-tls-certs\") pod 
\"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.793424 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhwnl\" (UniqueName: \"kubernetes.io/projected/20d8761e-4ce2-4312-8a80-8c3ce8908f2c-kube-api-access-bhwnl\") pod \"ovn-controller-zpdvh\" (UID: \"20d8761e-4ce2-4312-8a80-8c3ce8908f2c\") " pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.872385 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-log\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.872437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-run\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.872493 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw9qx\" (UniqueName: \"kubernetes.io/projected/74fee09d-11ad-45f4-a779-4c352b6dc67f-kube-api-access-sw9qx\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.872537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-etc-ovs\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 
15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.872557 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-lib\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.872593 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74fee09d-11ad-45f4-a779-4c352b6dc67f-scripts\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.873136 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-run\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.873273 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-etc-ovs\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.873304 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-log\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.873392 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/74fee09d-11ad-45f4-a779-4c352b6dc67f-var-lib\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.874442 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/74fee09d-11ad-45f4-a779-4c352b6dc67f-scripts\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.891124 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zpdvh" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.892462 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw9qx\" (UniqueName: \"kubernetes.io/projected/74fee09d-11ad-45f4-a779-4c352b6dc67f-kube-api-access-sw9qx\") pod \"ovn-controller-ovs-rv4hj\" (UID: \"74fee09d-11ad-45f4-a779-4c352b6dc67f\") " pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:46 crc kubenswrapper[4771]: I1001 15:11:46.937423 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.495800 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.499578 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.501776 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.502000 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-22wtc" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.502121 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.502238 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.503679 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.522632 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.683783 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98zc4\" (UniqueName: \"kubernetes.io/projected/064359ac-92c0-4674-a919-ccb8ffc0a5df-kube-api-access-98zc4\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.683835 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.683871 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/064359ac-92c0-4674-a919-ccb8ffc0a5df-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.683903 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/064359ac-92c0-4674-a919-ccb8ffc0a5df-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.683927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.684007 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.684051 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.684074 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/064359ac-92c0-4674-a919-ccb8ffc0a5df-config\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785095 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785172 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064359ac-92c0-4674-a919-ccb8ffc0a5df-config\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98zc4\" (UniqueName: \"kubernetes.io/projected/064359ac-92c0-4674-a919-ccb8ffc0a5df-kube-api-access-98zc4\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785289 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: 
\"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785313 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/064359ac-92c0-4674-a919-ccb8ffc0a5df-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785346 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/064359ac-92c0-4674-a919-ccb8ffc0a5df-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785368 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.785746 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.786124 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064359ac-92c0-4674-a919-ccb8ffc0a5df-config\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.786396 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/064359ac-92c0-4674-a919-ccb8ffc0a5df-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.786608 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/064359ac-92c0-4674-a919-ccb8ffc0a5df-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.790254 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.792284 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.802764 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98zc4\" (UniqueName: \"kubernetes.io/projected/064359ac-92c0-4674-a919-ccb8ffc0a5df-kube-api-access-98zc4\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.815093 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " 
pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.824423 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/064359ac-92c0-4674-a919-ccb8ffc0a5df-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"064359ac-92c0-4674-a919-ccb8ffc0a5df\") " pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:47 crc kubenswrapper[4771]: I1001 15:11:47.829095 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.487711 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.489447 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.495539 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-q9hfj" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.495726 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.495883 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.497653 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.517119 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.630297 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.630359 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.630386 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/999431d1-6d92-46de-ba0f-b253f96fe627-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.630404 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/999431d1-6d92-46de-ba0f-b253f96fe627-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.630432 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br45z\" (UniqueName: \"kubernetes.io/projected/999431d1-6d92-46de-ba0f-b253f96fe627-kube-api-access-br45z\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.630460 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999431d1-6d92-46de-ba0f-b253f96fe627-config\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " 
pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.630478 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.630497 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732014 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732079 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732110 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/999431d1-6d92-46de-ba0f-b253f96fe627-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732126 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/999431d1-6d92-46de-ba0f-b253f96fe627-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732157 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br45z\" (UniqueName: \"kubernetes.io/projected/999431d1-6d92-46de-ba0f-b253f96fe627-kube-api-access-br45z\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732183 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999431d1-6d92-46de-ba0f-b253f96fe627-config\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732223 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732289 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod 
\"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.732754 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/999431d1-6d92-46de-ba0f-b253f96fe627-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.733142 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/999431d1-6d92-46de-ba0f-b253f96fe627-config\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.733933 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/999431d1-6d92-46de-ba0f-b253f96fe627-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.752404 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.752481 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.752694 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/999431d1-6d92-46de-ba0f-b253f96fe627-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.761659 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br45z\" (UniqueName: \"kubernetes.io/projected/999431d1-6d92-46de-ba0f-b253f96fe627-kube-api-access-br45z\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.775233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-sb-0\" (UID: \"999431d1-6d92-46de-ba0f-b253f96fe627\") " pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:50 crc kubenswrapper[4771]: I1001 15:11:50.811008 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 01 15:11:55 crc kubenswrapper[4771]: E1001 15:11:55.105651 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 01 15:11:55 crc kubenswrapper[4771]: E1001 15:11:55.106288 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vj5v4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(4ef68dad-0f62-4a2d-aa86-23997c284df0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:11:55 crc 
kubenswrapper[4771]: E1001 15:11:55.107487 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" Oct 01 15:11:55 crc kubenswrapper[4771]: E1001 15:11:55.589319 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" Oct 01 15:12:04 crc kubenswrapper[4771]: E1001 15:12:04.665691 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Oct 01 15:12:04 crc kubenswrapper[4771]: E1001 15:12:04.667378 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:DB_ROOT_PASSWORD,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:DbRootPassword,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:secrets,ReadOnly:true,MountPath:/var/lib/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p5825,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,Ru
nAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(e902a14c-a59a-4278-b560-33de2cb50d32): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:12:04 crc kubenswrapper[4771]: E1001 15:12:04.668632 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="e902a14c-a59a-4278-b560-33de2cb50d32" Oct 01 15:12:04 crc kubenswrapper[4771]: E1001 15:12:04.725193 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Oct 01 15:12:04 crc kubenswrapper[4771]: E1001 15:12:04.725479 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tcsk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(f11eb6e5-8306-4db5-af63-ef4d869f7e2c): ErrImagePull: rpc error: code = Canceled 
desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:12:04 crc kubenswrapper[4771]: E1001 15:12:04.726716 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.670544 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.670604 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="e902a14c-a59a-4278-b560-33de2cb50d32" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.831641 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.831912 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qfbq2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-gtppt_openstack(8860d5a6-498b-4dcd-9648-93aa7afe9d41): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.833096 4771 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" podUID="8860d5a6-498b-4dcd-9648-93aa7afe9d41" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.908240 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.908422 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mdb9l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-xd9fd_openstack(ffa0e70f-f409-4edb-af80-45683a7ec8cb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.909671 4771 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" podUID="ffa0e70f-f409-4edb-af80-45683a7ec8cb" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.942147 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.942320 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4pbwq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-rv87g_openstack(d2091b34-b70f-4185-b37a-99e3bc0db9e7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.943490 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" podUID="d2091b34-b70f-4185-b37a-99e3bc0db9e7" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.947540 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.947664 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qn4gm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullP
olicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-nbmcf_openstack(182a382f-1b8f-49c4-8386-e9aa14cc4ffc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:12:05 crc kubenswrapper[4771]: E1001 15:12:05.948882 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" podUID="182a382f-1b8f-49c4-8386-e9aa14cc4ffc" Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.300174 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zpdvh"] Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.320489 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.327605 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.332484 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.422214 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.549169 4771 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 01 15:12:06 crc kubenswrapper[4771]: W1001 15:12:06.558975 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod999431d1_6d92_46de_ba0f_b253f96fe627.slice/crio-f549b30cca54f0948e40aa37582bc2be5f19cd9c504be2eeea412aef41a5ea40 WatchSource:0}: Error finding container f549b30cca54f0948e40aa37582bc2be5f19cd9c504be2eeea412aef41a5ea40: Status 404 returned error can't find the container with id f549b30cca54f0948e40aa37582bc2be5f19cd9c504be2eeea412aef41a5ea40 Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.681939 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37c38012-d257-4269-86fa-8cf3ef4de4cd","Type":"ContainerStarted","Data":"e4e7e19e67f533c386c0aee968ab82b2fb6e879a12f575ab1fa358fd159f6f48"} Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.684952 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zpdvh" event={"ID":"20d8761e-4ce2-4312-8a80-8c3ce8908f2c","Type":"ContainerStarted","Data":"9afa106c98684b17096e00dcd1a66b4478c3654a7172d84c811243123acdd687"} Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.687474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eca5bbfa-3927-4c5b-b973-7dce060db69b","Type":"ContainerStarted","Data":"bbafb8ac2a540e25e11b91db5a3bb2c49a8d1897e1e007ba4a5875748efa30ca"} Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.689179 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"999431d1-6d92-46de-ba0f-b253f96fe627","Type":"ContainerStarted","Data":"f549b30cca54f0948e40aa37582bc2be5f19cd9c504be2eeea412aef41a5ea40"} Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.691001 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" 
event={"ID":"107cd834-196a-4454-b70b-cbb3ab3631df","Type":"ContainerStarted","Data":"2cf55a1293153ace4b91f29ec4b199a9616772c51ab53da9ecf0dceecfff8634"} Oct 01 15:12:06 crc kubenswrapper[4771]: I1001 15:12:06.692380 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"064359ac-92c0-4674-a919-ccb8ffc0a5df","Type":"ContainerStarted","Data":"7ad094edbdc464c224291cec06460cec7ad2054524a5b237495fc05745dbb04d"} Oct 01 15:12:06 crc kubenswrapper[4771]: E1001 15:12:06.695137 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" podUID="182a382f-1b8f-49c4-8386-e9aa14cc4ffc" Oct 01 15:12:06 crc kubenswrapper[4771]: E1001 15:12:06.697240 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" podUID="8860d5a6-498b-4dcd-9648-93aa7afe9d41" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.059254 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.069932 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.107092 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-rv4hj"] Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.141695 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-dns-svc\") pod \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.141838 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2091b34-b70f-4185-b37a-99e3bc0db9e7-config\") pod \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\" (UID: \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\") " Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.141936 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-config\") pod \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.142032 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdb9l\" (UniqueName: \"kubernetes.io/projected/ffa0e70f-f409-4edb-af80-45683a7ec8cb-kube-api-access-mdb9l\") pod \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\" (UID: \"ffa0e70f-f409-4edb-af80-45683a7ec8cb\") " Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.142078 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pbwq\" (UniqueName: \"kubernetes.io/projected/d2091b34-b70f-4185-b37a-99e3bc0db9e7-kube-api-access-4pbwq\") pod \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\" (UID: \"d2091b34-b70f-4185-b37a-99e3bc0db9e7\") " Oct 01 15:12:07 
crc kubenswrapper[4771]: I1001 15:12:07.142692 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ffa0e70f-f409-4edb-af80-45683a7ec8cb" (UID: "ffa0e70f-f409-4edb-af80-45683a7ec8cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.143627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d2091b34-b70f-4185-b37a-99e3bc0db9e7-config" (OuterVolumeSpecName: "config") pod "d2091b34-b70f-4185-b37a-99e3bc0db9e7" (UID: "d2091b34-b70f-4185-b37a-99e3bc0db9e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.143673 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-config" (OuterVolumeSpecName: "config") pod "ffa0e70f-f409-4edb-af80-45683a7ec8cb" (UID: "ffa0e70f-f409-4edb-af80-45683a7ec8cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.148868 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffa0e70f-f409-4edb-af80-45683a7ec8cb-kube-api-access-mdb9l" (OuterVolumeSpecName: "kube-api-access-mdb9l") pod "ffa0e70f-f409-4edb-af80-45683a7ec8cb" (UID: "ffa0e70f-f409-4edb-af80-45683a7ec8cb"). InnerVolumeSpecName "kube-api-access-mdb9l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.148989 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2091b34-b70f-4185-b37a-99e3bc0db9e7-kube-api-access-4pbwq" (OuterVolumeSpecName: "kube-api-access-4pbwq") pod "d2091b34-b70f-4185-b37a-99e3bc0db9e7" (UID: "d2091b34-b70f-4185-b37a-99e3bc0db9e7"). InnerVolumeSpecName "kube-api-access-4pbwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.243277 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.243306 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d2091b34-b70f-4185-b37a-99e3bc0db9e7-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.243318 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa0e70f-f409-4edb-af80-45683a7ec8cb-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.243328 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdb9l\" (UniqueName: \"kubernetes.io/projected/ffa0e70f-f409-4edb-af80-45683a7ec8cb-kube-api-access-mdb9l\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.243343 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pbwq\" (UniqueName: \"kubernetes.io/projected/d2091b34-b70f-4185-b37a-99e3bc0db9e7-kube-api-access-4pbwq\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.701857 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rv4hj" 
event={"ID":"74fee09d-11ad-45f4-a779-4c352b6dc67f","Type":"ContainerStarted","Data":"e7496807fd5d0678046d8f44b0c913fe6cfa403b947b37e52a60829211120b9c"} Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.703228 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" event={"ID":"ffa0e70f-f409-4edb-af80-45683a7ec8cb","Type":"ContainerDied","Data":"de65681cc3f5f466adcbace8a9e4ebf156377eedafe3e3a935afa3cdfd64cca2"} Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.703245 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-xd9fd" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.704856 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" event={"ID":"d2091b34-b70f-4185-b37a-99e3bc0db9e7","Type":"ContainerDied","Data":"b94a90db235b38137eb65eacece325fb4cb3fa9dadee4afe3f537aed7b35e84e"} Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.704954 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-rv87g" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.770808 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xd9fd"] Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.787401 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-xd9fd"] Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.797410 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rv87g"] Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.801421 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-rv87g"] Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.998657 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2091b34-b70f-4185-b37a-99e3bc0db9e7" path="/var/lib/kubelet/pods/d2091b34-b70f-4185-b37a-99e3bc0db9e7/volumes" Oct 01 15:12:07 crc kubenswrapper[4771]: I1001 15:12:07.999090 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa0e70f-f409-4edb-af80-45683a7ec8cb" path="/var/lib/kubelet/pods/ffa0e70f-f409-4edb-af80-45683a7ec8cb/volumes" Oct 01 15:12:08 crc kubenswrapper[4771]: I1001 15:12:08.713569 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ef68dad-0f62-4a2d-aa86-23997c284df0","Type":"ContainerStarted","Data":"6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e"} Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.177524 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.178179 4771 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.178240 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.179187 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a954616bb5027e4b658bf522da064c2dd70331be4152d83f2506f267347e29d3"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.179284 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://a954616bb5027e4b658bf522da064c2dd70331be4152d83f2506f267347e29d3" gracePeriod=600 Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.489891 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-tckkt"] Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.491155 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.493376 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.497242 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tckkt"] Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.525645 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/553f381f-ef83-4876-8a81-df81a5be7dd8-ovs-rundir\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.525749 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psx78\" (UniqueName: \"kubernetes.io/projected/553f381f-ef83-4876-8a81-df81a5be7dd8-kube-api-access-psx78\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.525782 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/553f381f-ef83-4876-8a81-df81a5be7dd8-ovn-rundir\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.525822 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553f381f-ef83-4876-8a81-df81a5be7dd8-config\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " 
pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.525881 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/553f381f-ef83-4876-8a81-df81a5be7dd8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.525951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553f381f-ef83-4876-8a81-df81a5be7dd8-combined-ca-bundle\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.630258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/553f381f-ef83-4876-8a81-df81a5be7dd8-ovn-rundir\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.630669 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553f381f-ef83-4876-8a81-df81a5be7dd8-config\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.630698 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/553f381f-ef83-4876-8a81-df81a5be7dd8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " 
pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.630754 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553f381f-ef83-4876-8a81-df81a5be7dd8-combined-ca-bundle\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.630813 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/553f381f-ef83-4876-8a81-df81a5be7dd8-ovs-rundir\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.630836 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psx78\" (UniqueName: \"kubernetes.io/projected/553f381f-ef83-4876-8a81-df81a5be7dd8-kube-api-access-psx78\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.631372 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/553f381f-ef83-4876-8a81-df81a5be7dd8-ovn-rundir\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.632066 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553f381f-ef83-4876-8a81-df81a5be7dd8-config\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc 
kubenswrapper[4771]: I1001 15:12:12.632121 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/553f381f-ef83-4876-8a81-df81a5be7dd8-ovs-rundir\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.636426 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/553f381f-ef83-4876-8a81-df81a5be7dd8-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.636596 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553f381f-ef83-4876-8a81-df81a5be7dd8-combined-ca-bundle\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.637775 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gtppt"] Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.648611 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jsq2"] Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.651807 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.656775 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.664778 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jsq2"] Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.675502 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psx78\" (UniqueName: \"kubernetes.io/projected/553f381f-ef83-4876-8a81-df81a5be7dd8-kube-api-access-psx78\") pod \"ovn-controller-metrics-tckkt\" (UID: \"553f381f-ef83-4876-8a81-df81a5be7dd8\") " pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.732387 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.732444 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.732474 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-config\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 
15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.732507 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bp5vp\" (UniqueName: \"kubernetes.io/projected/9f4603c1-a6eb-4815-9126-0555a1089e21-kube-api-access-bp5vp\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.755801 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="a954616bb5027e4b658bf522da064c2dd70331be4152d83f2506f267347e29d3" exitCode=0 Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.755869 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"a954616bb5027e4b658bf522da064c2dd70331be4152d83f2506f267347e29d3"} Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.755895 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"a4ed1f9c5d09bf489c874da2478bf24ba55dbfc7c07deabe55036c8bafeb8e52"} Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.755910 4771 scope.go:117] "RemoveContainer" containerID="80215404bb4371102288dbe39becc7517e16d1990f65ff6bea20c4cb7f6f0681" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.761009 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"107cd834-196a-4454-b70b-cbb3ab3631df","Type":"ContainerStarted","Data":"38ee922e423818f793bc5bbcdef5ec7e8ece503bce477adf35984e8ea39e54da"} Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.761487 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 01 15:12:12 
crc kubenswrapper[4771]: I1001 15:12:12.779185 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37c38012-d257-4269-86fa-8cf3ef4de4cd","Type":"ContainerStarted","Data":"b263a45a2250733f15572053b39f22d5a28c6d601442591a0c2398614cc4f197"} Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.822901 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nbmcf"] Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.823435 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=27.499404389 podStartE2EDuration="32.823418431s" podCreationTimestamp="2025-10-01 15:11:40 +0000 UTC" firstStartedPulling="2025-10-01 15:12:06.306649666 +0000 UTC m=+970.925824837" lastFinishedPulling="2025-10-01 15:12:11.630663668 +0000 UTC m=+976.249838879" observedRunningTime="2025-10-01 15:12:12.79667469 +0000 UTC m=+977.415849861" watchObservedRunningTime="2025-10-01 15:12:12.823418431 +0000 UTC m=+977.442593602" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.829593 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-tckkt" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.834829 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zpcj9"] Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.835842 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.835899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.835940 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-config\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.835975 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bp5vp\" (UniqueName: \"kubernetes.io/projected/9f4603c1-a6eb-4815-9126-0555a1089e21-kube-api-access-bp5vp\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.837109 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.837960 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.839448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-config\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.840172 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.855320 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.875894 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zpcj9"] Oct 01 15:12:12 crc kubenswrapper[4771]: I1001 15:12:12.887675 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bp5vp\" (UniqueName: \"kubernetes.io/projected/9f4603c1-a6eb-4815-9126-0555a1089e21-kube-api-access-bp5vp\") pod \"dnsmasq-dns-7fd796d7df-4jsq2\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.017256 4771 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.037150 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.044904 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.044947 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.044967 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v99f\" (UniqueName: \"kubernetes.io/projected/483f4f4a-6fce-4e64-9775-010730037203-kube-api-access-2v99f\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.044988 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.045041 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-config\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.123435 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.145693 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-dns-svc\") pod \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.145769 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-config\") pod \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.145805 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfbq2\" (UniqueName: \"kubernetes.io/projected/8860d5a6-498b-4dcd-9648-93aa7afe9d41-kube-api-access-qfbq2\") pod \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\" (UID: \"8860d5a6-498b-4dcd-9648-93aa7afe9d41\") " Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.145992 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 
15:12:13.146011 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2v99f\" (UniqueName: \"kubernetes.io/projected/483f4f4a-6fce-4e64-9775-010730037203-kube-api-access-2v99f\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.146031 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.146080 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-config\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.146180 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.147177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8860d5a6-498b-4dcd-9648-93aa7afe9d41" (UID: "8860d5a6-498b-4dcd-9648-93aa7afe9d41"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.147494 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-config" (OuterVolumeSpecName: "config") pod "8860d5a6-498b-4dcd-9648-93aa7afe9d41" (UID: "8860d5a6-498b-4dcd-9648-93aa7afe9d41"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.149244 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.150284 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-config\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.153383 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.154166 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc 
kubenswrapper[4771]: I1001 15:12:13.156995 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8860d5a6-498b-4dcd-9648-93aa7afe9d41-kube-api-access-qfbq2" (OuterVolumeSpecName: "kube-api-access-qfbq2") pod "8860d5a6-498b-4dcd-9648-93aa7afe9d41" (UID: "8860d5a6-498b-4dcd-9648-93aa7afe9d41"). InnerVolumeSpecName "kube-api-access-qfbq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.165222 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v99f\" (UniqueName: \"kubernetes.io/projected/483f4f4a-6fce-4e64-9775-010730037203-kube-api-access-2v99f\") pod \"dnsmasq-dns-86db49b7ff-zpcj9\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.248345 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn4gm\" (UniqueName: \"kubernetes.io/projected/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-kube-api-access-qn4gm\") pod \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.249377 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-config\") pod \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.249536 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-dns-svc\") pod \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\" (UID: \"182a382f-1b8f-49c4-8386-e9aa14cc4ffc\") " Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.250101 4771 reconciler_common.go:293] "Volume detached for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.250122 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8860d5a6-498b-4dcd-9648-93aa7afe9d41-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.250134 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfbq2\" (UniqueName: \"kubernetes.io/projected/8860d5a6-498b-4dcd-9648-93aa7afe9d41-kube-api-access-qfbq2\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.250697 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-config" (OuterVolumeSpecName: "config") pod "182a382f-1b8f-49c4-8386-e9aa14cc4ffc" (UID: "182a382f-1b8f-49c4-8386-e9aa14cc4ffc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.251379 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "182a382f-1b8f-49c4-8386-e9aa14cc4ffc" (UID: "182a382f-1b8f-49c4-8386-e9aa14cc4ffc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.253861 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-kube-api-access-qn4gm" (OuterVolumeSpecName: "kube-api-access-qn4gm") pod "182a382f-1b8f-49c4-8386-e9aa14cc4ffc" (UID: "182a382f-1b8f-49c4-8386-e9aa14cc4ffc"). InnerVolumeSpecName "kube-api-access-qn4gm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.256642 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.285408 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jsq2"] Oct 01 15:12:13 crc kubenswrapper[4771]: W1001 15:12:13.289387 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f4603c1_a6eb_4815_9126_0555a1089e21.slice/crio-287d5c6d1624b9e49f50cf3eb11e29b0137aedf3de1cafa36724568d0b1a636a WatchSource:0}: Error finding container 287d5c6d1624b9e49f50cf3eb11e29b0137aedf3de1cafa36724568d0b1a636a: Status 404 returned error can't find the container with id 287d5c6d1624b9e49f50cf3eb11e29b0137aedf3de1cafa36724568d0b1a636a Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.359227 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tckkt"] Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.359406 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.359626 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.359640 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn4gm\" (UniqueName: \"kubernetes.io/projected/182a382f-1b8f-49c4-8386-e9aa14cc4ffc-kube-api-access-qn4gm\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.478193 4771 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zpcj9"] Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.790594 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tckkt" event={"ID":"553f381f-ef83-4876-8a81-df81a5be7dd8","Type":"ContainerStarted","Data":"f7bf11ce3fe71911cdc0d112bc52ad8fd960964cd7b4bfd9998f5b112ab5e97d"} Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.792088 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.792114 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-nbmcf" event={"ID":"182a382f-1b8f-49c4-8386-e9aa14cc4ffc","Type":"ContainerDied","Data":"fbebe17b53d58468161ee2cd5f74c468eb70c813d82505a849538357e5069523"} Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.793996 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" event={"ID":"9f4603c1-a6eb-4815-9126-0555a1089e21","Type":"ContainerStarted","Data":"287d5c6d1624b9e49f50cf3eb11e29b0137aedf3de1cafa36724568d0b1a636a"} Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.795585 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" event={"ID":"8860d5a6-498b-4dcd-9648-93aa7afe9d41","Type":"ContainerDied","Data":"7bb78ef999d8ba08e4daaf2ac5c653b29570675bab09a7b94bdfd05f1c08f541"} Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.795667 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-gtppt" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.797952 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zpdvh" event={"ID":"20d8761e-4ce2-4312-8a80-8c3ce8908f2c","Type":"ContainerStarted","Data":"037a040eb6c50a34547d5511dd5e1c38ae8062e8feb7d2e87861cc73007ccbb9"} Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.798212 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-zpdvh" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.843323 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zpdvh" podStartSLOduration=21.886266319 podStartE2EDuration="27.843298553s" podCreationTimestamp="2025-10-01 15:11:46 +0000 UTC" firstStartedPulling="2025-10-01 15:12:06.307105067 +0000 UTC m=+970.926280238" lastFinishedPulling="2025-10-01 15:12:12.264137291 +0000 UTC m=+976.883312472" observedRunningTime="2025-10-01 15:12:13.82983286 +0000 UTC m=+978.449008071" watchObservedRunningTime="2025-10-01 15:12:13.843298553 +0000 UTC m=+978.462473754" Oct 01 15:12:13 crc kubenswrapper[4771]: W1001 15:12:13.909658 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod483f4f4a_6fce_4e64_9775_010730037203.slice/crio-dd70b974e90f845193c05dd0d3cece071c433eff899a940a3433d1b852e66975 WatchSource:0}: Error finding container dd70b974e90f845193c05dd0d3cece071c433eff899a940a3433d1b852e66975: Status 404 returned error can't find the container with id dd70b974e90f845193c05dd0d3cece071c433eff899a940a3433d1b852e66975 Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.928892 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-nbmcf"] Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.936619 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-666b6646f7-nbmcf"] Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.966477 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gtppt"] Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.970576 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-gtppt"] Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.998467 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="182a382f-1b8f-49c4-8386-e9aa14cc4ffc" path="/var/lib/kubelet/pods/182a382f-1b8f-49c4-8386-e9aa14cc4ffc/volumes" Oct 01 15:12:13 crc kubenswrapper[4771]: I1001 15:12:13.999093 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8860d5a6-498b-4dcd-9648-93aa7afe9d41" path="/var/lib/kubelet/pods/8860d5a6-498b-4dcd-9648-93aa7afe9d41/volumes" Oct 01 15:12:14 crc kubenswrapper[4771]: I1001 15:12:14.814574 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" event={"ID":"483f4f4a-6fce-4e64-9775-010730037203","Type":"ContainerStarted","Data":"dd70b974e90f845193c05dd0d3cece071c433eff899a940a3433d1b852e66975"} Oct 01 15:12:16 crc kubenswrapper[4771]: I1001 15:12:16.207343 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.277393 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jsq2"] Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.315349 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-4tm2t"] Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.316939 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.324802 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4tm2t"] Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.444514 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldhc\" (UniqueName: \"kubernetes.io/projected/a66f8282-966c-40de-9bfd-b6b75d5f519c-kube-api-access-qldhc\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.444715 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.444945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.445022 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-dns-svc\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.445068 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-config\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.546786 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-dns-svc\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.546861 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-config\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.546934 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldhc\" (UniqueName: \"kubernetes.io/projected/a66f8282-966c-40de-9bfd-b6b75d5f519c-kube-api-access-qldhc\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.547005 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.547076 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.548345 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-config\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.548387 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.548359 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.548723 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-dns-svc\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.568422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldhc\" (UniqueName: 
\"kubernetes.io/projected/a66f8282-966c-40de-9bfd-b6b75d5f519c-kube-api-access-qldhc\") pod \"dnsmasq-dns-698758b865-4tm2t\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:23 crc kubenswrapper[4771]: I1001 15:12:23.640062 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.412865 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.418939 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.421200 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.421203 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.421352 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-t989p" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.423887 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.442336 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.461116 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1916131b-f4ff-4f49-8abc-a640dc07abc4-lock\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.461196 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1916131b-f4ff-4f49-8abc-a640dc07abc4-cache\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.461290 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.461321 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwrxm\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-kube-api-access-hwrxm\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.461350 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.562476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1916131b-f4ff-4f49-8abc-a640dc07abc4-lock\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.562526 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1916131b-f4ff-4f49-8abc-a640dc07abc4-cache\") pod 
\"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.562591 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.562623 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwrxm\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-kube-api-access-hwrxm\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.562644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: E1001 15:12:24.562807 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 15:12:24 crc kubenswrapper[4771]: E1001 15:12:24.562822 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 15:12:24 crc kubenswrapper[4771]: E1001 15:12:24.562869 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift podName:1916131b-f4ff-4f49-8abc-a640dc07abc4 nodeName:}" failed. No retries permitted until 2025-10-01 15:12:25.062851842 +0000 UTC m=+989.682027013 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift") pod "swift-storage-0" (UID: "1916131b-f4ff-4f49-8abc-a640dc07abc4") : configmap "swift-ring-files" not found Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.563060 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/1916131b-f4ff-4f49-8abc-a640dc07abc4-lock\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.563190 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.563238 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/1916131b-f4ff-4f49-8abc-a640dc07abc4-cache\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.581432 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwrxm\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-kube-api-access-hwrxm\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:24 crc kubenswrapper[4771]: I1001 15:12:24.588906 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " 
pod="openstack/swift-storage-0" Oct 01 15:12:25 crc kubenswrapper[4771]: I1001 15:12:25.072046 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:25 crc kubenswrapper[4771]: E1001 15:12:25.072267 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 15:12:25 crc kubenswrapper[4771]: E1001 15:12:25.072285 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 15:12:25 crc kubenswrapper[4771]: E1001 15:12:25.072348 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift podName:1916131b-f4ff-4f49-8abc-a640dc07abc4 nodeName:}" failed. No retries permitted until 2025-10-01 15:12:26.072328742 +0000 UTC m=+990.691503913 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift") pod "swift-storage-0" (UID: "1916131b-f4ff-4f49-8abc-a640dc07abc4") : configmap "swift-ring-files" not found Oct 01 15:12:26 crc kubenswrapper[4771]: I1001 15:12:26.119763 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:26 crc kubenswrapper[4771]: E1001 15:12:26.120071 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 15:12:26 crc kubenswrapper[4771]: E1001 15:12:26.120098 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 15:12:26 crc kubenswrapper[4771]: E1001 15:12:26.120154 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift podName:1916131b-f4ff-4f49-8abc-a640dc07abc4 nodeName:}" failed. No retries permitted until 2025-10-01 15:12:28.120133974 +0000 UTC m=+992.739309145 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift") pod "swift-storage-0" (UID: "1916131b-f4ff-4f49-8abc-a640dc07abc4") : configmap "swift-ring-files" not found Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.151790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:28 crc kubenswrapper[4771]: E1001 15:12:28.152038 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 15:12:28 crc kubenswrapper[4771]: E1001 15:12:28.152566 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 15:12:28 crc kubenswrapper[4771]: E1001 15:12:28.152771 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift podName:1916131b-f4ff-4f49-8abc-a640dc07abc4 nodeName:}" failed. No retries permitted until 2025-10-01 15:12:32.152693971 +0000 UTC m=+996.771869182 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift") pod "swift-storage-0" (UID: "1916131b-f4ff-4f49-8abc-a640dc07abc4") : configmap "swift-ring-files" not found Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.370118 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-hzvjt"] Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.372472 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.375891 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.377024 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.383115 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.385437 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hzvjt"] Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.457835 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-ring-data-devices\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.457888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-swiftconf\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.457913 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-scripts\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.457939 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj8tl\" (UniqueName: \"kubernetes.io/projected/a56d3441-b413-4629-870b-49c208943243-kube-api-access-zj8tl\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.457963 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-combined-ca-bundle\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.457992 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-dispersionconf\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.458033 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a56d3441-b413-4629-870b-49c208943243-etc-swift\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.559352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-swiftconf\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.559431 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-scripts\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.559502 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj8tl\" (UniqueName: \"kubernetes.io/projected/a56d3441-b413-4629-870b-49c208943243-kube-api-access-zj8tl\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.559560 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-combined-ca-bundle\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.559649 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-dispersionconf\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.559806 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a56d3441-b413-4629-870b-49c208943243-etc-swift\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.559999 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-ring-data-devices\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.560973 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a56d3441-b413-4629-870b-49c208943243-etc-swift\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.561044 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-scripts\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.561533 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-ring-data-devices\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.567456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-dispersionconf\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.583206 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-swiftconf\") pod 
\"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.584002 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-combined-ca-bundle\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.585886 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj8tl\" (UniqueName: \"kubernetes.io/projected/a56d3441-b413-4629-870b-49c208943243-kube-api-access-zj8tl\") pod \"swift-ring-rebalance-hzvjt\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:28 crc kubenswrapper[4771]: I1001 15:12:28.714979 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:30 crc kubenswrapper[4771]: E1001 15:12:30.471192 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Oct 01 15:12:30 crc kubenswrapper[4771]: E1001 15:12:30.471569 4771 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Oct 01 15:12:30 crc kubenswrapper[4771]: E1001 15:12:30.471696 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-h9z5w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(eca5bbfa-3927-4c5b-b973-7dce060db69b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Oct 01 15:12:30 crc kubenswrapper[4771]: E1001 15:12:30.472999 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="eca5bbfa-3927-4c5b-b973-7dce060db69b" Oct 01 15:12:30 crc kubenswrapper[4771]: E1001 15:12:30.952040 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified" Oct 01 15:12:30 crc kubenswrapper[4771]: E1001 15:12:30.952452 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovs-rundir,ReadOnly:true,MountPath:/var/run/openvswitch,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovn-rundir,ReadOnly:true,MountPath:/var/run/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-psx78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[NET_ADMIN SYS_ADMIN 
SYS_NICE],Drop:[],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-controller-metrics-tckkt_openstack(553f381f-ef83-4876-8a81-df81a5be7dd8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:12:30 crc kubenswrapper[4771]: E1001 15:12:30.953667 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ovn-controller-metrics-tckkt" podUID="553f381f-ef83-4876-8a81-df81a5be7dd8" Oct 01 15:12:30 crc kubenswrapper[4771]: E1001 15:12:30.955759 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="eca5bbfa-3927-4c5b-b973-7dce060db69b" Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.392043 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hzvjt"] Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.459220 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4tm2t"] Oct 01 15:12:31 crc kubenswrapper[4771]: W1001 15:12:31.472517 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda56d3441_b413_4629_870b_49c208943243.slice/crio-e6e29d7ed38f04e3d89347303546f15073e7db2cfb4ec9f2c64f982a7062a1c1 WatchSource:0}: Error finding container e6e29d7ed38f04e3d89347303546f15073e7db2cfb4ec9f2c64f982a7062a1c1: Status 404 returned error can't find the container with id e6e29d7ed38f04e3d89347303546f15073e7db2cfb4ec9f2c64f982a7062a1c1 Oct 01 15:12:31 crc kubenswrapper[4771]: W1001 15:12:31.478265 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda66f8282_966c_40de_9bfd_b6b75d5f519c.slice/crio-148b87f32c66d63c41fa765c5c9ec6366ed06e55a0e8a472fac7399e98760609 WatchSource:0}: Error finding container 148b87f32c66d63c41fa765c5c9ec6366ed06e55a0e8a472fac7399e98760609: Status 404 returned error can't find the container with id 148b87f32c66d63c41fa765c5c9ec6366ed06e55a0e8a472fac7399e98760609 Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.958898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e902a14c-a59a-4278-b560-33de2cb50d32","Type":"ContainerStarted","Data":"854896a3d2c0c1f8faa1797301701a598a385b198d0044d345ca91b2eef40efc"} Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.962542 4771 generic.go:334] "Generic (PLEG): container finished" podID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerID="00b63e9bb2fedec4a96328107b84b5fcd377240fd28be4b85603755ca670a80c" exitCode=0 Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.962603 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4tm2t" event={"ID":"a66f8282-966c-40de-9bfd-b6b75d5f519c","Type":"ContainerDied","Data":"00b63e9bb2fedec4a96328107b84b5fcd377240fd28be4b85603755ca670a80c"} Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.962625 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4tm2t" 
event={"ID":"a66f8282-966c-40de-9bfd-b6b75d5f519c","Type":"ContainerStarted","Data":"148b87f32c66d63c41fa765c5c9ec6366ed06e55a0e8a472fac7399e98760609"} Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.965765 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rv4hj" event={"ID":"74fee09d-11ad-45f4-a779-4c352b6dc67f","Type":"ContainerStarted","Data":"1e022fa4cdd94ea6b2751b20aafd82906790cb634646de2ed43f7df588e0c66f"} Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.966682 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hzvjt" event={"ID":"a56d3441-b413-4629-870b-49c208943243","Type":"ContainerStarted","Data":"e6e29d7ed38f04e3d89347303546f15073e7db2cfb4ec9f2c64f982a7062a1c1"} Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.968880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"999431d1-6d92-46de-ba0f-b253f96fe627","Type":"ContainerStarted","Data":"f0e9be94028de35c1d38a8056909c86c5e59278c346b0209a10b1ec9a1142814"} Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.970207 4771 generic.go:334] "Generic (PLEG): container finished" podID="483f4f4a-6fce-4e64-9775-010730037203" containerID="c2dc030037ae16412609923853d07c4b69cff846c3e8a72dedc77e0f06f0693d" exitCode=0 Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.970263 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" event={"ID":"483f4f4a-6fce-4e64-9775-010730037203","Type":"ContainerDied","Data":"c2dc030037ae16412609923853d07c4b69cff846c3e8a72dedc77e0f06f0693d"} Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.972717 4771 generic.go:334] "Generic (PLEG): container finished" podID="9f4603c1-a6eb-4815-9126-0555a1089e21" containerID="2425c56cd80450c4aee90dcfd2a9a554fce6ec8448d14fc8a08cc8786d45434b" exitCode=0 Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.972803 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" event={"ID":"9f4603c1-a6eb-4815-9126-0555a1089e21","Type":"ContainerDied","Data":"2425c56cd80450c4aee90dcfd2a9a554fce6ec8448d14fc8a08cc8786d45434b"} Oct 01 15:12:31 crc kubenswrapper[4771]: I1001 15:12:31.977280 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"064359ac-92c0-4674-a919-ccb8ffc0a5df","Type":"ContainerStarted","Data":"be8db8dd9923bce9839e538d738566027601fae1cc8c331fc085652527b81fae"} Oct 01 15:12:31 crc kubenswrapper[4771]: E1001 15:12:31.979894 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"" pod="openstack/ovn-controller-metrics-tckkt" podUID="553f381f-ef83-4876-8a81-df81a5be7dd8" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.238425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:32 crc kubenswrapper[4771]: E1001 15:12:32.238688 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 15:12:32 crc kubenswrapper[4771]: E1001 15:12:32.238883 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 15:12:32 crc kubenswrapper[4771]: E1001 15:12:32.238945 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift podName:1916131b-f4ff-4f49-8abc-a640dc07abc4 nodeName:}" failed. 
No retries permitted until 2025-10-01 15:12:40.238926564 +0000 UTC m=+1004.858101735 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift") pod "swift-storage-0" (UID: "1916131b-f4ff-4f49-8abc-a640dc07abc4") : configmap "swift-ring-files" not found Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.530583 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:32 crc kubenswrapper[4771]: E1001 15:12:32.630306 4771 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Oct 01 15:12:32 crc kubenswrapper[4771]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/483f4f4a-6fce-4e64-9775-010730037203/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 15:12:32 crc kubenswrapper[4771]: > podSandboxID="dd70b974e90f845193c05dd0d3cece071c433eff899a940a3433d1b852e66975" Oct 01 15:12:32 crc kubenswrapper[4771]: E1001 15:12:32.630459 4771 kuberuntime_manager.go:1274] "Unhandled Error" err=< Oct 01 15:12:32 crc kubenswrapper[4771]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv 
--log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n599h5cbh7ch5d4h66fh676hdbh546h95h88h5ffh55ch7fhch57ch687hddhc7h5fdh57dh674h56fh64ch98h9bh557h55dh646h54ch54fh5c4h597q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-nb,SubPath:ovsdbserver-nb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-sb,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/ovsdbserver-sb,SubPath:ovsdbserver-sb,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2v99f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-86db49b7ff-zpcj9_openstack(483f4f4a-6fce-4e64-9775-010730037203): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/483f4f4a-6fce-4e64-9775-010730037203/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Oct 01 15:12:32 crc kubenswrapper[4771]: > logger="UnhandledError" Oct 01 15:12:32 crc kubenswrapper[4771]: E1001 15:12:32.631761 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/483f4f4a-6fce-4e64-9775-010730037203/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" podUID="483f4f4a-6fce-4e64-9775-010730037203" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.646819 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-config\") pod \"9f4603c1-a6eb-4815-9126-0555a1089e21\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " 
Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.646888 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bp5vp\" (UniqueName: \"kubernetes.io/projected/9f4603c1-a6eb-4815-9126-0555a1089e21-kube-api-access-bp5vp\") pod \"9f4603c1-a6eb-4815-9126-0555a1089e21\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.646905 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-dns-svc\") pod \"9f4603c1-a6eb-4815-9126-0555a1089e21\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.646954 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-ovsdbserver-nb\") pod \"9f4603c1-a6eb-4815-9126-0555a1089e21\" (UID: \"9f4603c1-a6eb-4815-9126-0555a1089e21\") " Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.656252 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f4603c1-a6eb-4815-9126-0555a1089e21-kube-api-access-bp5vp" (OuterVolumeSpecName: "kube-api-access-bp5vp") pod "9f4603c1-a6eb-4815-9126-0555a1089e21" (UID: "9f4603c1-a6eb-4815-9126-0555a1089e21"). InnerVolumeSpecName "kube-api-access-bp5vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.670141 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9f4603c1-a6eb-4815-9126-0555a1089e21" (UID: "9f4603c1-a6eb-4815-9126-0555a1089e21"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.670208 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-config" (OuterVolumeSpecName: "config") pod "9f4603c1-a6eb-4815-9126-0555a1089e21" (UID: "9f4603c1-a6eb-4815-9126-0555a1089e21"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.672248 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9f4603c1-a6eb-4815-9126-0555a1089e21" (UID: "9f4603c1-a6eb-4815-9126-0555a1089e21"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.748558 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.748595 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.748610 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bp5vp\" (UniqueName: \"kubernetes.io/projected/9f4603c1-a6eb-4815-9126-0555a1089e21-kube-api-access-bp5vp\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.748624 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9f4603c1-a6eb-4815-9126-0555a1089e21-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:32 crc 
kubenswrapper[4771]: I1001 15:12:32.987240 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.987231 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-4jsq2" event={"ID":"9f4603c1-a6eb-4815-9126-0555a1089e21","Type":"ContainerDied","Data":"287d5c6d1624b9e49f50cf3eb11e29b0137aedf3de1cafa36724568d0b1a636a"} Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.987629 4771 scope.go:117] "RemoveContainer" containerID="2425c56cd80450c4aee90dcfd2a9a554fce6ec8448d14fc8a08cc8786d45434b" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.988971 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"064359ac-92c0-4674-a919-ccb8ffc0a5df","Type":"ContainerStarted","Data":"3448c9f81b9b80e0e9bb9039a0358750b202817e64730ee8cb0dd99dbded3f2f"} Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.993350 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4tm2t" event={"ID":"a66f8282-966c-40de-9bfd-b6b75d5f519c","Type":"ContainerStarted","Data":"28bfc8f04d9096eedd6aed5ba39ff9d3b683674cb90a41f4b2e783c56d01aa3b"} Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.993486 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.996875 4771 generic.go:334] "Generic (PLEG): container finished" podID="74fee09d-11ad-45f4-a779-4c352b6dc67f" containerID="1e022fa4cdd94ea6b2751b20aafd82906790cb634646de2ed43f7df588e0c66f" exitCode=0 Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.996956 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rv4hj" 
event={"ID":"74fee09d-11ad-45f4-a779-4c352b6dc67f","Type":"ContainerDied","Data":"1e022fa4cdd94ea6b2751b20aafd82906790cb634646de2ed43f7df588e0c66f"} Oct 01 15:12:32 crc kubenswrapper[4771]: I1001 15:12:32.998632 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f11eb6e5-8306-4db5-af63-ef4d869f7e2c","Type":"ContainerStarted","Data":"c3eb3e3832dbb71c6552b52fffc9d46d565d2ed58dc9ada50391280b5d29480a"} Oct 01 15:12:33 crc kubenswrapper[4771]: I1001 15:12:33.011094 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=20.959634072 podStartE2EDuration="47.011072045s" podCreationTimestamp="2025-10-01 15:11:46 +0000 UTC" firstStartedPulling="2025-10-01 15:12:06.425943954 +0000 UTC m=+971.045119125" lastFinishedPulling="2025-10-01 15:12:32.477381937 +0000 UTC m=+997.096557098" observedRunningTime="2025-10-01 15:12:33.004569904 +0000 UTC m=+997.623745075" watchObservedRunningTime="2025-10-01 15:12:33.011072045 +0000 UTC m=+997.630247226" Oct 01 15:12:33 crc kubenswrapper[4771]: I1001 15:12:33.054058 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-4tm2t" podStartSLOduration=10.054040416 podStartE2EDuration="10.054040416s" podCreationTimestamp="2025-10-01 15:12:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:12:33.048821638 +0000 UTC m=+997.667996809" watchObservedRunningTime="2025-10-01 15:12:33.054040416 +0000 UTC m=+997.673215577" Oct 01 15:12:33 crc kubenswrapper[4771]: I1001 15:12:33.119334 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jsq2"] Oct 01 15:12:33 crc kubenswrapper[4771]: I1001 15:12:33.126806 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-4jsq2"] Oct 01 15:12:33 crc 
kubenswrapper[4771]: I1001 15:12:33.996945 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f4603c1-a6eb-4815-9126-0555a1089e21" path="/var/lib/kubelet/pods/9f4603c1-a6eb-4815-9126-0555a1089e21/volumes" Oct 01 15:12:34 crc kubenswrapper[4771]: I1001 15:12:34.010686 4771 generic.go:334] "Generic (PLEG): container finished" podID="37c38012-d257-4269-86fa-8cf3ef4de4cd" containerID="b263a45a2250733f15572053b39f22d5a28c6d601442591a0c2398614cc4f197" exitCode=0 Oct 01 15:12:34 crc kubenswrapper[4771]: I1001 15:12:34.010775 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37c38012-d257-4269-86fa-8cf3ef4de4cd","Type":"ContainerDied","Data":"b263a45a2250733f15572053b39f22d5a28c6d601442591a0c2398614cc4f197"} Oct 01 15:12:34 crc kubenswrapper[4771]: I1001 15:12:34.013402 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rv4hj" event={"ID":"74fee09d-11ad-45f4-a779-4c352b6dc67f","Type":"ContainerStarted","Data":"fc8c117f28a209b1ff7be96ad00ae59cee9ac510e8760323b9901e6dd4155fac"} Oct 01 15:12:34 crc kubenswrapper[4771]: I1001 15:12:34.016871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"999431d1-6d92-46de-ba0f-b253f96fe627","Type":"ContainerStarted","Data":"47028a0121de8b90b1a881090af1647b601051bf9aa4dcb8756731be011c5b59"} Oct 01 15:12:34 crc kubenswrapper[4771]: I1001 15:12:34.019151 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" event={"ID":"483f4f4a-6fce-4e64-9775-010730037203","Type":"ContainerStarted","Data":"281e043cfe8c0685df0e3e9cd2bd4f8c7871140e966b11702312f4c7b863c841"} Oct 01 15:12:34 crc kubenswrapper[4771]: I1001 15:12:34.019329 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:34 crc kubenswrapper[4771]: I1001 15:12:34.051497 4771 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" podStartSLOduration=5.058957605 podStartE2EDuration="22.051482594s" podCreationTimestamp="2025-10-01 15:12:12 +0000 UTC" firstStartedPulling="2025-10-01 15:12:13.987680011 +0000 UTC m=+978.606855192" lastFinishedPulling="2025-10-01 15:12:30.98020499 +0000 UTC m=+995.599380181" observedRunningTime="2025-10-01 15:12:34.048996873 +0000 UTC m=+998.668172044" watchObservedRunningTime="2025-10-01 15:12:34.051482594 +0000 UTC m=+998.670657765" Oct 01 15:12:34 crc kubenswrapper[4771]: I1001 15:12:34.082211 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=18.532303313 podStartE2EDuration="45.082187273s" podCreationTimestamp="2025-10-01 15:11:49 +0000 UTC" firstStartedPulling="2025-10-01 15:12:06.563654127 +0000 UTC m=+971.182829298" lastFinishedPulling="2025-10-01 15:12:33.113538087 +0000 UTC m=+997.732713258" observedRunningTime="2025-10-01 15:12:34.076222106 +0000 UTC m=+998.695397337" watchObservedRunningTime="2025-10-01 15:12:34.082187273 +0000 UTC m=+998.701362484" Oct 01 15:12:35 crc kubenswrapper[4771]: I1001 15:12:35.812163 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 01 15:12:35 crc kubenswrapper[4771]: I1001 15:12:35.812841 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 01 15:12:35 crc kubenswrapper[4771]: I1001 15:12:35.829623 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 01 15:12:35 crc kubenswrapper[4771]: I1001 15:12:35.860791 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 01 15:12:35 crc kubenswrapper[4771]: I1001 15:12:35.887848 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/ovsdbserver-nb-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.048609 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"37c38012-d257-4269-86fa-8cf3ef4de4cd","Type":"ContainerStarted","Data":"597f50ee3f3ec16a440467211c4ca3a2c361e9bc6770560e6c931f620980fed6"} Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.054511 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-rv4hj" event={"ID":"74fee09d-11ad-45f4-a779-4c352b6dc67f","Type":"ContainerStarted","Data":"97e9f6248043a0cc56868244e528d0572a3c0f7246c5fd44c083d29f025e6251"} Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.054893 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.054931 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.105436 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=51.780377611 podStartE2EDuration="57.105412928s" podCreationTimestamp="2025-10-01 15:11:39 +0000 UTC" firstStartedPulling="2025-10-01 15:12:06.309997909 +0000 UTC m=+970.929173080" lastFinishedPulling="2025-10-01 15:12:11.635033216 +0000 UTC m=+976.254208397" observedRunningTime="2025-10-01 15:12:36.072270819 +0000 UTC m=+1000.691446030" watchObservedRunningTime="2025-10-01 15:12:36.105412928 +0000 UTC m=+1000.724588109" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.113982 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.115864 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-rv4hj" podStartSLOduration=40.301695402 
podStartE2EDuration="50.115846266s" podCreationTimestamp="2025-10-01 15:11:46 +0000 UTC" firstStartedPulling="2025-10-01 15:12:07.154524861 +0000 UTC m=+971.773700032" lastFinishedPulling="2025-10-01 15:12:16.968675715 +0000 UTC m=+981.587850896" observedRunningTime="2025-10-01 15:12:36.093180436 +0000 UTC m=+1000.712355627" watchObservedRunningTime="2025-10-01 15:12:36.115846266 +0000 UTC m=+1000.735021437" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.151103 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.367627 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 01 15:12:36 crc kubenswrapper[4771]: E1001 15:12:36.368320 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f4603c1-a6eb-4815-9126-0555a1089e21" containerName="init" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.368343 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f4603c1-a6eb-4815-9126-0555a1089e21" containerName="init" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.368586 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f4603c1-a6eb-4815-9126-0555a1089e21" containerName="init" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.369929 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.375039 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.375164 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-4gbjv" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.375235 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.375416 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.390012 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.431537 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.431673 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/553def8f-6710-4724-a4b7-a9f6e2c310e6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.431786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " 
pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.431880 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553def8f-6710-4724-a4b7-a9f6e2c310e6-config\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.431983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh26w\" (UniqueName: \"kubernetes.io/projected/553def8f-6710-4724-a4b7-a9f6e2c310e6-kube-api-access-dh26w\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.432199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/553def8f-6710-4724-a4b7-a9f6e2c310e6-scripts\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.432234 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.533216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh26w\" (UniqueName: \"kubernetes.io/projected/553def8f-6710-4724-a4b7-a9f6e2c310e6-kube-api-access-dh26w\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.533300 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/553def8f-6710-4724-a4b7-a9f6e2c310e6-scripts\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.533318 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.533345 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.533378 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/553def8f-6710-4724-a4b7-a9f6e2c310e6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.533431 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.533457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553def8f-6710-4724-a4b7-a9f6e2c310e6-config\") pod \"ovn-northd-0\" (UID: 
\"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.534410 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/553def8f-6710-4724-a4b7-a9f6e2c310e6-config\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.534500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/553def8f-6710-4724-a4b7-a9f6e2c310e6-scripts\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.534840 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/553def8f-6710-4724-a4b7-a9f6e2c310e6-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.541298 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.544906 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.545131 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/553def8f-6710-4724-a4b7-a9f6e2c310e6-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.552256 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh26w\" (UniqueName: \"kubernetes.io/projected/553def8f-6710-4724-a4b7-a9f6e2c310e6-kube-api-access-dh26w\") pod \"ovn-northd-0\" (UID: \"553def8f-6710-4724-a4b7-a9f6e2c310e6\") " pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.684321 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 01 15:12:36 crc kubenswrapper[4771]: I1001 15:12:36.937889 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-rv4hj" Oct 01 15:12:37 crc kubenswrapper[4771]: I1001 15:12:37.067438 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hzvjt" event={"ID":"a56d3441-b413-4629-870b-49c208943243","Type":"ContainerStarted","Data":"17f9310cc8f5cb934e0037a5fbaa6f489410557510252754a9d8b383461b613f"} Oct 01 15:12:37 crc kubenswrapper[4771]: I1001 15:12:37.094457 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-hzvjt" podStartSLOduration=4.588243666 podStartE2EDuration="9.094427817s" podCreationTimestamp="2025-10-01 15:12:28 +0000 UTC" firstStartedPulling="2025-10-01 15:12:31.478279658 +0000 UTC m=+996.097454829" lastFinishedPulling="2025-10-01 15:12:35.984463799 +0000 UTC m=+1000.603638980" observedRunningTime="2025-10-01 15:12:37.087606389 +0000 UTC m=+1001.706781570" watchObservedRunningTime="2025-10-01 15:12:37.094427817 +0000 UTC m=+1001.713602998" Oct 01 15:12:37 crc kubenswrapper[4771]: I1001 15:12:37.136677 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 01 15:12:37 crc 
kubenswrapper[4771]: W1001 15:12:37.138751 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod553def8f_6710_4724_a4b7_a9f6e2c310e6.slice/crio-be2ebbbbe05383ab1f8536074eafac35e619dc15c497912c7deab51bc71d3777 WatchSource:0}: Error finding container be2ebbbbe05383ab1f8536074eafac35e619dc15c497912c7deab51bc71d3777: Status 404 returned error can't find the container with id be2ebbbbe05383ab1f8536074eafac35e619dc15c497912c7deab51bc71d3777 Oct 01 15:12:38 crc kubenswrapper[4771]: I1001 15:12:38.078115 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"553def8f-6710-4724-a4b7-a9f6e2c310e6","Type":"ContainerStarted","Data":"be2ebbbbe05383ab1f8536074eafac35e619dc15c497912c7deab51bc71d3777"} Oct 01 15:12:38 crc kubenswrapper[4771]: I1001 15:12:38.258856 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:38 crc kubenswrapper[4771]: I1001 15:12:38.642048 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:12:38 crc kubenswrapper[4771]: I1001 15:12:38.712079 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zpcj9"] Oct 01 15:12:39 crc kubenswrapper[4771]: I1001 15:12:39.085334 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" podUID="483f4f4a-6fce-4e64-9775-010730037203" containerName="dnsmasq-dns" containerID="cri-o://281e043cfe8c0685df0e3e9cd2bd4f8c7871140e966b11702312f4c7b863c841" gracePeriod=10 Oct 01 15:12:40 crc kubenswrapper[4771]: I1001 15:12:40.317864 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: 
\"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:40 crc kubenswrapper[4771]: E1001 15:12:40.318127 4771 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 01 15:12:40 crc kubenswrapper[4771]: E1001 15:12:40.318160 4771 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 01 15:12:40 crc kubenswrapper[4771]: E1001 15:12:40.318239 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift podName:1916131b-f4ff-4f49-8abc-a640dc07abc4 nodeName:}" failed. No retries permitted until 2025-10-01 15:12:56.31821623 +0000 UTC m=+1020.937391431 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift") pod "swift-storage-0" (UID: "1916131b-f4ff-4f49-8abc-a640dc07abc4") : configmap "swift-ring-files" not found Oct 01 15:12:40 crc kubenswrapper[4771]: I1001 15:12:40.911215 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 01 15:12:40 crc kubenswrapper[4771]: I1001 15:12:40.911592 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.108168 4771 generic.go:334] "Generic (PLEG): container finished" podID="483f4f4a-6fce-4e64-9775-010730037203" containerID="281e043cfe8c0685df0e3e9cd2bd4f8c7871140e966b11702312f4c7b863c841" exitCode=0 Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.108243 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" 
event={"ID":"483f4f4a-6fce-4e64-9775-010730037203","Type":"ContainerDied","Data":"281e043cfe8c0685df0e3e9cd2bd4f8c7871140e966b11702312f4c7b863c841"} Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.843895 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.943063 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-config\") pod \"483f4f4a-6fce-4e64-9775-010730037203\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.943139 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2v99f\" (UniqueName: \"kubernetes.io/projected/483f4f4a-6fce-4e64-9775-010730037203-kube-api-access-2v99f\") pod \"483f4f4a-6fce-4e64-9775-010730037203\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.943237 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-nb\") pod \"483f4f4a-6fce-4e64-9775-010730037203\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.943275 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-sb\") pod \"483f4f4a-6fce-4e64-9775-010730037203\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.943354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-dns-svc\") 
pod \"483f4f4a-6fce-4e64-9775-010730037203\" (UID: \"483f4f4a-6fce-4e64-9775-010730037203\") " Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.950498 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483f4f4a-6fce-4e64-9775-010730037203-kube-api-access-2v99f" (OuterVolumeSpecName: "kube-api-access-2v99f") pod "483f4f4a-6fce-4e64-9775-010730037203" (UID: "483f4f4a-6fce-4e64-9775-010730037203"). InnerVolumeSpecName "kube-api-access-2v99f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:41 crc kubenswrapper[4771]: I1001 15:12:41.990332 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "483f4f4a-6fce-4e64-9775-010730037203" (UID: "483f4f4a-6fce-4e64-9775-010730037203"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.012101 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "483f4f4a-6fce-4e64-9775-010730037203" (UID: "483f4f4a-6fce-4e64-9775-010730037203"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.020147 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "483f4f4a-6fce-4e64-9775-010730037203" (UID: "483f4f4a-6fce-4e64-9775-010730037203"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.021430 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-config" (OuterVolumeSpecName: "config") pod "483f4f4a-6fce-4e64-9775-010730037203" (UID: "483f4f4a-6fce-4e64-9775-010730037203"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.045724 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.045768 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.045781 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2v99f\" (UniqueName: \"kubernetes.io/projected/483f4f4a-6fce-4e64-9775-010730037203-kube-api-access-2v99f\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.045794 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.045807 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/483f4f4a-6fce-4e64-9775-010730037203-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.118611 4771 generic.go:334] "Generic (PLEG): container finished" podID="4ef68dad-0f62-4a2d-aa86-23997c284df0" 
containerID="6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e" exitCode=0 Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.118716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ef68dad-0f62-4a2d-aa86-23997c284df0","Type":"ContainerDied","Data":"6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e"} Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.122627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" event={"ID":"483f4f4a-6fce-4e64-9775-010730037203","Type":"ContainerDied","Data":"dd70b974e90f845193c05dd0d3cece071c433eff899a940a3433d1b852e66975"} Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.122784 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-zpcj9" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.122824 4771 scope.go:117] "RemoveContainer" containerID="281e043cfe8c0685df0e3e9cd2bd4f8c7871140e966b11702312f4c7b863c841" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.156212 4771 scope.go:117] "RemoveContainer" containerID="c2dc030037ae16412609923853d07c4b69cff846c3e8a72dedc77e0f06f0693d" Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.191899 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zpcj9"] Oct 01 15:12:42 crc kubenswrapper[4771]: I1001 15:12:42.197975 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-zpcj9"] Oct 01 15:12:43 crc kubenswrapper[4771]: I1001 15:12:43.136378 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ef68dad-0f62-4a2d-aa86-23997c284df0","Type":"ContainerStarted","Data":"880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3"} Oct 01 15:12:43 crc kubenswrapper[4771]: I1001 15:12:43.998037 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="483f4f4a-6fce-4e64-9775-010730037203" path="/var/lib/kubelet/pods/483f4f4a-6fce-4e64-9775-010730037203/volumes" Oct 01 15:12:44 crc kubenswrapper[4771]: I1001 15:12:44.146351 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"553def8f-6710-4724-a4b7-a9f6e2c310e6","Type":"ContainerStarted","Data":"14c92c3a3a667031cbd25248d89b345bf912fc942c51d5182f164dc12e46213b"} Oct 01 15:12:44 crc kubenswrapper[4771]: I1001 15:12:44.148296 4771 generic.go:334] "Generic (PLEG): container finished" podID="e902a14c-a59a-4278-b560-33de2cb50d32" containerID="854896a3d2c0c1f8faa1797301701a598a385b198d0044d345ca91b2eef40efc" exitCode=0 Oct 01 15:12:44 crc kubenswrapper[4771]: I1001 15:12:44.148396 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e902a14c-a59a-4278-b560-33de2cb50d32","Type":"ContainerDied","Data":"854896a3d2c0c1f8faa1797301701a598a385b198d0044d345ca91b2eef40efc"} Oct 01 15:12:44 crc kubenswrapper[4771]: I1001 15:12:44.148561 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 15:12:44 crc kubenswrapper[4771]: I1001 15:12:44.198135 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.194362477 podStartE2EDuration="1m8.198108145s" podCreationTimestamp="2025-10-01 15:11:36 +0000 UTC" firstStartedPulling="2025-10-01 15:11:38.160063161 +0000 UTC m=+942.779238332" lastFinishedPulling="2025-10-01 15:12:07.163808829 +0000 UTC m=+971.782984000" observedRunningTime="2025-10-01 15:12:44.193007298 +0000 UTC m=+1008.812182489" watchObservedRunningTime="2025-10-01 15:12:44.198108145 +0000 UTC m=+1008.817283356" Oct 01 15:12:45 crc kubenswrapper[4771]: I1001 15:12:45.032019 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 01 15:12:45 crc kubenswrapper[4771]: I1001 
15:12:45.105131 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 01 15:12:45 crc kubenswrapper[4771]: I1001 15:12:45.168881 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"553def8f-6710-4724-a4b7-a9f6e2c310e6","Type":"ContainerStarted","Data":"a6cce3e7415c4be8fb771429e12d7b59a0514a7c548a81f6b0d7ac92fd97b576"} Oct 01 15:12:45 crc kubenswrapper[4771]: I1001 15:12:45.168978 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 01 15:12:45 crc kubenswrapper[4771]: I1001 15:12:45.172530 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"e902a14c-a59a-4278-b560-33de2cb50d32","Type":"ContainerStarted","Data":"5d8f6d4edb9210ad9b86ca5bd301bedae60ceff90bd810377e3ed21cedbfd8ab"} Oct 01 15:12:45 crc kubenswrapper[4771]: I1001 15:12:45.225529 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.455572363 podStartE2EDuration="9.225501363s" podCreationTimestamp="2025-10-01 15:12:36 +0000 UTC" firstStartedPulling="2025-10-01 15:12:37.142061864 +0000 UTC m=+1001.761237035" lastFinishedPulling="2025-10-01 15:12:43.911990854 +0000 UTC m=+1008.531166035" observedRunningTime="2025-10-01 15:12:45.212134732 +0000 UTC m=+1009.831309903" watchObservedRunningTime="2025-10-01 15:12:45.225501363 +0000 UTC m=+1009.844676534" Oct 01 15:12:45 crc kubenswrapper[4771]: I1001 15:12:45.269326 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=-9223371969.585466 podStartE2EDuration="1m7.269309725s" podCreationTimestamp="2025-10-01 15:11:38 +0000 UTC" firstStartedPulling="2025-10-01 15:11:40.25128864 +0000 UTC m=+944.870463811" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:12:45.268248378 +0000 UTC 
m=+1009.887423549" watchObservedRunningTime="2025-10-01 15:12:45.269309725 +0000 UTC m=+1009.888484896" Oct 01 15:12:46 crc kubenswrapper[4771]: I1001 15:12:46.180029 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tckkt" event={"ID":"553f381f-ef83-4876-8a81-df81a5be7dd8","Type":"ContainerStarted","Data":"2810aa16a7eb21192d96e2836726c39c7feb41f59043c166a96cd3e2b29ef32d"} Oct 01 15:12:46 crc kubenswrapper[4771]: I1001 15:12:46.181132 4771 generic.go:334] "Generic (PLEG): container finished" podID="a56d3441-b413-4629-870b-49c208943243" containerID="17f9310cc8f5cb934e0037a5fbaa6f489410557510252754a9d8b383461b613f" exitCode=0 Oct 01 15:12:46 crc kubenswrapper[4771]: I1001 15:12:46.181205 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hzvjt" event={"ID":"a56d3441-b413-4629-870b-49c208943243","Type":"ContainerDied","Data":"17f9310cc8f5cb934e0037a5fbaa6f489410557510252754a9d8b383461b613f"} Oct 01 15:12:46 crc kubenswrapper[4771]: I1001 15:12:46.199463 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-tckkt" podStartSLOduration=-9223372002.65533 podStartE2EDuration="34.199444789s" podCreationTimestamp="2025-10-01 15:12:12 +0000 UTC" firstStartedPulling="2025-10-01 15:12:13.904513216 +0000 UTC m=+978.523688387" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:12:46.197027709 +0000 UTC m=+1010.816202890" watchObservedRunningTime="2025-10-01 15:12:46.199444789 +0000 UTC m=+1010.818619980" Oct 01 15:12:46 crc kubenswrapper[4771]: I1001 15:12:46.931624 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zpdvh" podUID="20d8761e-4ce2-4312-8a80-8c3ce8908f2c" containerName="ovn-controller" probeResult="failure" output=< Oct 01 15:12:46 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 
15:12:46 crc kubenswrapper[4771]: > Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.191492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eca5bbfa-3927-4c5b-b973-7dce060db69b","Type":"ContainerStarted","Data":"2ebe277151b21c46d2d99dbef04c1ed19ccdc61d7ba1739b0a3ef2c97c4243d6"} Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.193952 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.213156 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=24.926286455 podStartE2EDuration="1m5.213114407s" podCreationTimestamp="2025-10-01 15:11:42 +0000 UTC" firstStartedPulling="2025-10-01 15:12:06.343615757 +0000 UTC m=+970.962790928" lastFinishedPulling="2025-10-01 15:12:46.630443689 +0000 UTC m=+1011.249618880" observedRunningTime="2025-10-01 15:12:47.21200823 +0000 UTC m=+1011.831183441" watchObservedRunningTime="2025-10-01 15:12:47.213114407 +0000 UTC m=+1011.832289578" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.651256 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.764035 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a56d3441-b413-4629-870b-49c208943243-etc-swift\") pod \"a56d3441-b413-4629-870b-49c208943243\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.764129 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-scripts\") pod \"a56d3441-b413-4629-870b-49c208943243\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.764165 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-combined-ca-bundle\") pod \"a56d3441-b413-4629-870b-49c208943243\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.764210 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zj8tl\" (UniqueName: \"kubernetes.io/projected/a56d3441-b413-4629-870b-49c208943243-kube-api-access-zj8tl\") pod \"a56d3441-b413-4629-870b-49c208943243\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.764249 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-swiftconf\") pod \"a56d3441-b413-4629-870b-49c208943243\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.764300 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-ring-data-devices\") pod \"a56d3441-b413-4629-870b-49c208943243\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.764431 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-dispersionconf\") pod \"a56d3441-b413-4629-870b-49c208943243\" (UID: \"a56d3441-b413-4629-870b-49c208943243\") " Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.764826 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a56d3441-b413-4629-870b-49c208943243" (UID: "a56d3441-b413-4629-870b-49c208943243"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.765388 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a56d3441-b413-4629-870b-49c208943243-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a56d3441-b413-4629-870b-49c208943243" (UID: "a56d3441-b413-4629-870b-49c208943243"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.776228 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a56d3441-b413-4629-870b-49c208943243-kube-api-access-zj8tl" (OuterVolumeSpecName: "kube-api-access-zj8tl") pod "a56d3441-b413-4629-870b-49c208943243" (UID: "a56d3441-b413-4629-870b-49c208943243"). InnerVolumeSpecName "kube-api-access-zj8tl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.776567 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a56d3441-b413-4629-870b-49c208943243" (UID: "a56d3441-b413-4629-870b-49c208943243"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.791641 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a56d3441-b413-4629-870b-49c208943243" (UID: "a56d3441-b413-4629-870b-49c208943243"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.792926 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-scripts" (OuterVolumeSpecName: "scripts") pod "a56d3441-b413-4629-870b-49c208943243" (UID: "a56d3441-b413-4629-870b-49c208943243"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.795385 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a56d3441-b413-4629-870b-49c208943243" (UID: "a56d3441-b413-4629-870b-49c208943243"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.865940 4771 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.865988 4771 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.866000 4771 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a56d3441-b413-4629-870b-49c208943243-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.866012 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a56d3441-b413-4629-870b-49c208943243-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.866025 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.866038 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zj8tl\" (UniqueName: \"kubernetes.io/projected/a56d3441-b413-4629-870b-49c208943243-kube-api-access-zj8tl\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:47 crc kubenswrapper[4771]: I1001 15:12:47.866052 4771 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a56d3441-b413-4629-870b-49c208943243-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:48 crc kubenswrapper[4771]: I1001 15:12:48.201250 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hzvjt" event={"ID":"a56d3441-b413-4629-870b-49c208943243","Type":"ContainerDied","Data":"e6e29d7ed38f04e3d89347303546f15073e7db2cfb4ec9f2c64f982a7062a1c1"} Oct 01 15:12:48 crc kubenswrapper[4771]: I1001 15:12:48.201505 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e6e29d7ed38f04e3d89347303546f15073e7db2cfb4ec9f2c64f982a7062a1c1" Oct 01 15:12:48 crc kubenswrapper[4771]: I1001 15:12:48.201299 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hzvjt" Oct 01 15:12:49 crc kubenswrapper[4771]: I1001 15:12:49.603543 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 01 15:12:49 crc kubenswrapper[4771]: I1001 15:12:49.604884 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 01 15:12:49 crc kubenswrapper[4771]: I1001 15:12:49.665402 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.274315 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.885676 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-c9qvq"] Oct 01 15:12:50 crc kubenswrapper[4771]: E1001 15:12:50.886474 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a56d3441-b413-4629-870b-49c208943243" containerName="swift-ring-rebalance" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.886494 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a56d3441-b413-4629-870b-49c208943243" containerName="swift-ring-rebalance" Oct 01 15:12:50 crc kubenswrapper[4771]: E1001 15:12:50.886509 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="483f4f4a-6fce-4e64-9775-010730037203" containerName="init" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.886517 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="483f4f4a-6fce-4e64-9775-010730037203" containerName="init" Oct 01 15:12:50 crc kubenswrapper[4771]: E1001 15:12:50.886533 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483f4f4a-6fce-4e64-9775-010730037203" containerName="dnsmasq-dns" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.886545 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="483f4f4a-6fce-4e64-9775-010730037203" containerName="dnsmasq-dns" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.886800 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="483f4f4a-6fce-4e64-9775-010730037203" containerName="dnsmasq-dns" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.886837 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a56d3441-b413-4629-870b-49c208943243" containerName="swift-ring-rebalance" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.887523 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-c9qvq" Oct 01 15:12:50 crc kubenswrapper[4771]: I1001 15:12:50.897199 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c9qvq"] Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.036077 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gktgm\" (UniqueName: \"kubernetes.io/projected/bccbccff-65c9-487b-b3bd-160f41dc53ee-kube-api-access-gktgm\") pod \"keystone-db-create-c9qvq\" (UID: \"bccbccff-65c9-487b-b3bd-160f41dc53ee\") " pod="openstack/keystone-db-create-c9qvq" Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.082349 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7sh6p"] Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.083541 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7sh6p" Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.093103 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7sh6p"] Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.138414 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gktgm\" (UniqueName: \"kubernetes.io/projected/bccbccff-65c9-487b-b3bd-160f41dc53ee-kube-api-access-gktgm\") pod \"keystone-db-create-c9qvq\" (UID: \"bccbccff-65c9-487b-b3bd-160f41dc53ee\") " pod="openstack/keystone-db-create-c9qvq" Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.159718 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gktgm\" (UniqueName: \"kubernetes.io/projected/bccbccff-65c9-487b-b3bd-160f41dc53ee-kube-api-access-gktgm\") pod \"keystone-db-create-c9qvq\" (UID: \"bccbccff-65c9-487b-b3bd-160f41dc53ee\") " pod="openstack/keystone-db-create-c9qvq" Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.229770 4771 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c9qvq" Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.240615 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm5jg\" (UniqueName: \"kubernetes.io/projected/fc054581-8c29-4f8e-b1eb-7903c06dfd17-kube-api-access-wm5jg\") pod \"placement-db-create-7sh6p\" (UID: \"fc054581-8c29-4f8e-b1eb-7903c06dfd17\") " pod="openstack/placement-db-create-7sh6p" Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.342810 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm5jg\" (UniqueName: \"kubernetes.io/projected/fc054581-8c29-4f8e-b1eb-7903c06dfd17-kube-api-access-wm5jg\") pod \"placement-db-create-7sh6p\" (UID: \"fc054581-8c29-4f8e-b1eb-7903c06dfd17\") " pod="openstack/placement-db-create-7sh6p" Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.383580 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm5jg\" (UniqueName: \"kubernetes.io/projected/fc054581-8c29-4f8e-b1eb-7903c06dfd17-kube-api-access-wm5jg\") pod \"placement-db-create-7sh6p\" (UID: \"fc054581-8c29-4f8e-b1eb-7903c06dfd17\") " pod="openstack/placement-db-create-7sh6p" Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.399013 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7sh6p" Oct 01 15:12:51 crc kubenswrapper[4771]: W1001 15:12:51.707806 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbccbccff_65c9_487b_b3bd_160f41dc53ee.slice/crio-6e3f39b028cfbe60d95ab750e35271be57a4ee39e01b530f421a6f106cf0bc39 WatchSource:0}: Error finding container 6e3f39b028cfbe60d95ab750e35271be57a4ee39e01b530f421a6f106cf0bc39: Status 404 returned error can't find the container with id 6e3f39b028cfbe60d95ab750e35271be57a4ee39e01b530f421a6f106cf0bc39 Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.708245 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-c9qvq"] Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.878305 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7sh6p"] Oct 01 15:12:51 crc kubenswrapper[4771]: I1001 15:12:51.987196 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zpdvh" podUID="20d8761e-4ce2-4312-8a80-8c3ce8908f2c" containerName="ovn-controller" probeResult="failure" output=< Oct 01 15:12:51 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 15:12:51 crc kubenswrapper[4771]: > Oct 01 15:12:52 crc kubenswrapper[4771]: I1001 15:12:52.247034 4771 generic.go:334] "Generic (PLEG): container finished" podID="fc054581-8c29-4f8e-b1eb-7903c06dfd17" containerID="484c257f9380eb59f879f2cbeabad5e5394798cb4e491c7f2d5e94e3c94d1115" exitCode=0 Oct 01 15:12:52 crc kubenswrapper[4771]: I1001 15:12:52.247126 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7sh6p" event={"ID":"fc054581-8c29-4f8e-b1eb-7903c06dfd17","Type":"ContainerDied","Data":"484c257f9380eb59f879f2cbeabad5e5394798cb4e491c7f2d5e94e3c94d1115"} Oct 01 15:12:52 crc kubenswrapper[4771]: I1001 15:12:52.247154 
4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7sh6p" event={"ID":"fc054581-8c29-4f8e-b1eb-7903c06dfd17","Type":"ContainerStarted","Data":"a0eacbc81bdaa87b71dd20e6e1ff5606adb87ede25bac600e66435690d64a95a"} Oct 01 15:12:52 crc kubenswrapper[4771]: I1001 15:12:52.249230 4771 generic.go:334] "Generic (PLEG): container finished" podID="bccbccff-65c9-487b-b3bd-160f41dc53ee" containerID="46fdfe9fde4ce0ed6e5020b10715e2684b8498221cf22a391b32800dd7765b89" exitCode=0 Oct 01 15:12:52 crc kubenswrapper[4771]: I1001 15:12:52.249307 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c9qvq" event={"ID":"bccbccff-65c9-487b-b3bd-160f41dc53ee","Type":"ContainerDied","Data":"46fdfe9fde4ce0ed6e5020b10715e2684b8498221cf22a391b32800dd7765b89"} Oct 01 15:12:52 crc kubenswrapper[4771]: I1001 15:12:52.249376 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c9qvq" event={"ID":"bccbccff-65c9-487b-b3bd-160f41dc53ee","Type":"ContainerStarted","Data":"6e3f39b028cfbe60d95ab750e35271be57a4ee39e01b530f421a6f106cf0bc39"} Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.181137 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.606101 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c9qvq" Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.611554 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7sh6p" Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.692877 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gktgm\" (UniqueName: \"kubernetes.io/projected/bccbccff-65c9-487b-b3bd-160f41dc53ee-kube-api-access-gktgm\") pod \"bccbccff-65c9-487b-b3bd-160f41dc53ee\" (UID: \"bccbccff-65c9-487b-b3bd-160f41dc53ee\") " Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.693003 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm5jg\" (UniqueName: \"kubernetes.io/projected/fc054581-8c29-4f8e-b1eb-7903c06dfd17-kube-api-access-wm5jg\") pod \"fc054581-8c29-4f8e-b1eb-7903c06dfd17\" (UID: \"fc054581-8c29-4f8e-b1eb-7903c06dfd17\") " Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.703132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bccbccff-65c9-487b-b3bd-160f41dc53ee-kube-api-access-gktgm" (OuterVolumeSpecName: "kube-api-access-gktgm") pod "bccbccff-65c9-487b-b3bd-160f41dc53ee" (UID: "bccbccff-65c9-487b-b3bd-160f41dc53ee"). InnerVolumeSpecName "kube-api-access-gktgm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.703215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc054581-8c29-4f8e-b1eb-7903c06dfd17-kube-api-access-wm5jg" (OuterVolumeSpecName: "kube-api-access-wm5jg") pod "fc054581-8c29-4f8e-b1eb-7903c06dfd17" (UID: "fc054581-8c29-4f8e-b1eb-7903c06dfd17"). InnerVolumeSpecName "kube-api-access-wm5jg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.794615 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm5jg\" (UniqueName: \"kubernetes.io/projected/fc054581-8c29-4f8e-b1eb-7903c06dfd17-kube-api-access-wm5jg\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:53 crc kubenswrapper[4771]: I1001 15:12:53.794650 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gktgm\" (UniqueName: \"kubernetes.io/projected/bccbccff-65c9-487b-b3bd-160f41dc53ee-kube-api-access-gktgm\") on node \"crc\" DevicePath \"\"" Oct 01 15:12:54 crc kubenswrapper[4771]: I1001 15:12:54.266921 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-c9qvq" event={"ID":"bccbccff-65c9-487b-b3bd-160f41dc53ee","Type":"ContainerDied","Data":"6e3f39b028cfbe60d95ab750e35271be57a4ee39e01b530f421a6f106cf0bc39"} Oct 01 15:12:54 crc kubenswrapper[4771]: I1001 15:12:54.267260 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6e3f39b028cfbe60d95ab750e35271be57a4ee39e01b530f421a6f106cf0bc39" Oct 01 15:12:54 crc kubenswrapper[4771]: I1001 15:12:54.267697 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-c9qvq" Oct 01 15:12:54 crc kubenswrapper[4771]: I1001 15:12:54.269452 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7sh6p" event={"ID":"fc054581-8c29-4f8e-b1eb-7903c06dfd17","Type":"ContainerDied","Data":"a0eacbc81bdaa87b71dd20e6e1ff5606adb87ede25bac600e66435690d64a95a"} Oct 01 15:12:54 crc kubenswrapper[4771]: I1001 15:12:54.269480 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0eacbc81bdaa87b71dd20e6e1ff5606adb87ede25bac600e66435690d64a95a" Oct 01 15:12:54 crc kubenswrapper[4771]: I1001 15:12:54.269541 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7sh6p" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.328247 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lrw6m"] Oct 01 15:12:56 crc kubenswrapper[4771]: E1001 15:12:56.329112 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bccbccff-65c9-487b-b3bd-160f41dc53ee" containerName="mariadb-database-create" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.329133 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bccbccff-65c9-487b-b3bd-160f41dc53ee" containerName="mariadb-database-create" Oct 01 15:12:56 crc kubenswrapper[4771]: E1001 15:12:56.329158 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc054581-8c29-4f8e-b1eb-7903c06dfd17" containerName="mariadb-database-create" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.329169 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc054581-8c29-4f8e-b1eb-7903c06dfd17" containerName="mariadb-database-create" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.329470 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc054581-8c29-4f8e-b1eb-7903c06dfd17" containerName="mariadb-database-create" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.329511 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bccbccff-65c9-487b-b3bd-160f41dc53ee" containerName="mariadb-database-create" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.330282 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lrw6m" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.344488 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lrw6m"] Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.350329 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.362230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/1916131b-f4ff-4f49-8abc-a640dc07abc4-etc-swift\") pod \"swift-storage-0\" (UID: \"1916131b-f4ff-4f49-8abc-a640dc07abc4\") " pod="openstack/swift-storage-0" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.452300 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfd8s\" (UniqueName: \"kubernetes.io/projected/448dc4ba-7224-4c6b-8448-e50389967c50-kube-api-access-rfd8s\") pod \"glance-db-create-lrw6m\" (UID: \"448dc4ba-7224-4c6b-8448-e50389967c50\") " pod="openstack/glance-db-create-lrw6m" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.542095 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.554832 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfd8s\" (UniqueName: \"kubernetes.io/projected/448dc4ba-7224-4c6b-8448-e50389967c50-kube-api-access-rfd8s\") pod \"glance-db-create-lrw6m\" (UID: \"448dc4ba-7224-4c6b-8448-e50389967c50\") " pod="openstack/glance-db-create-lrw6m" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.588385 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfd8s\" (UniqueName: \"kubernetes.io/projected/448dc4ba-7224-4c6b-8448-e50389967c50-kube-api-access-rfd8s\") pod \"glance-db-create-lrw6m\" (UID: \"448dc4ba-7224-4c6b-8448-e50389967c50\") " pod="openstack/glance-db-create-lrw6m" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.659188 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lrw6m" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.774115 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 01 15:12:56 crc kubenswrapper[4771]: I1001 15:12:56.949585 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zpdvh" podUID="20d8761e-4ce2-4312-8a80-8c3ce8908f2c" containerName="ovn-controller" probeResult="failure" output=< Oct 01 15:12:56 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 01 15:12:56 crc kubenswrapper[4771]: > Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.229084 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lrw6m"] Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.237317 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.300156 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrw6m" event={"ID":"448dc4ba-7224-4c6b-8448-e50389967c50","Type":"ContainerStarted","Data":"027800442f76982b7e8c7faa7205c60ab187a5a9beee78a6eb34c53df58389b6"} Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.301544 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"2a560b2fb960ea29d0dfce9586178c1a0bd18b19b4afd1656c5467fa8618d659"} Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.595061 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.858926 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-2vz57"] Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.859874 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2vz57" Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.879290 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2vz57"] Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.954580 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-ddg7l"] Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.955493 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ddg7l" Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.963556 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ddg7l"] Oct 01 15:12:57 crc kubenswrapper[4771]: I1001 15:12:57.983467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcpgl\" (UniqueName: \"kubernetes.io/projected/650906d6-4b76-4503-b0ee-e59f0a3302fb-kube-api-access-lcpgl\") pod \"cinder-db-create-2vz57\" (UID: \"650906d6-4b76-4503-b0ee-e59f0a3302fb\") " pod="openstack/cinder-db-create-2vz57" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.084725 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jltm5\" (UniqueName: \"kubernetes.io/projected/b6b325a5-74ef-42ee-89a8-c5855a4dc1f8-kube-api-access-jltm5\") pod \"barbican-db-create-ddg7l\" (UID: \"b6b325a5-74ef-42ee-89a8-c5855a4dc1f8\") " pod="openstack/barbican-db-create-ddg7l" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.084815 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcpgl\" (UniqueName: \"kubernetes.io/projected/650906d6-4b76-4503-b0ee-e59f0a3302fb-kube-api-access-lcpgl\") pod \"cinder-db-create-2vz57\" (UID: \"650906d6-4b76-4503-b0ee-e59f0a3302fb\") " pod="openstack/cinder-db-create-2vz57" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.103185 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcpgl\" (UniqueName: \"kubernetes.io/projected/650906d6-4b76-4503-b0ee-e59f0a3302fb-kube-api-access-lcpgl\") pod \"cinder-db-create-2vz57\" (UID: \"650906d6-4b76-4503-b0ee-e59f0a3302fb\") " pod="openstack/cinder-db-create-2vz57" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.186684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jltm5\" 
(UniqueName: \"kubernetes.io/projected/b6b325a5-74ef-42ee-89a8-c5855a4dc1f8-kube-api-access-jltm5\") pod \"barbican-db-create-ddg7l\" (UID: \"b6b325a5-74ef-42ee-89a8-c5855a4dc1f8\") " pod="openstack/barbican-db-create-ddg7l" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.202764 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jltm5\" (UniqueName: \"kubernetes.io/projected/b6b325a5-74ef-42ee-89a8-c5855a4dc1f8-kube-api-access-jltm5\") pod \"barbican-db-create-ddg7l\" (UID: \"b6b325a5-74ef-42ee-89a8-c5855a4dc1f8\") " pod="openstack/barbican-db-create-ddg7l" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.223525 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2vz57" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.256604 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-627pv"] Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.257536 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-627pv" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.278330 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-ddg7l" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.297687 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-627pv"] Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.311146 4771 generic.go:334] "Generic (PLEG): container finished" podID="448dc4ba-7224-4c6b-8448-e50389967c50" containerID="ddef70121a06df011cd9c0e8719f5dd03127b6c67d02bf94dafe6a280be85d6b" exitCode=0 Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.311194 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrw6m" event={"ID":"448dc4ba-7224-4c6b-8448-e50389967c50","Type":"ContainerDied","Data":"ddef70121a06df011cd9c0e8719f5dd03127b6c67d02bf94dafe6a280be85d6b"} Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.390013 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vfkr\" (UniqueName: \"kubernetes.io/projected/74e7f311-b206-4643-a136-5d40e30e7e39-kube-api-access-2vfkr\") pod \"neutron-db-create-627pv\" (UID: \"74e7f311-b206-4643-a136-5d40e30e7e39\") " pod="openstack/neutron-db-create-627pv" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.492023 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vfkr\" (UniqueName: \"kubernetes.io/projected/74e7f311-b206-4643-a136-5d40e30e7e39-kube-api-access-2vfkr\") pod \"neutron-db-create-627pv\" (UID: \"74e7f311-b206-4643-a136-5d40e30e7e39\") " pod="openstack/neutron-db-create-627pv" Oct 01 15:12:58 crc kubenswrapper[4771]: I1001 15:12:58.510075 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vfkr\" (UniqueName: \"kubernetes.io/projected/74e7f311-b206-4643-a136-5d40e30e7e39-kube-api-access-2vfkr\") pod \"neutron-db-create-627pv\" (UID: \"74e7f311-b206-4643-a136-5d40e30e7e39\") " pod="openstack/neutron-db-create-627pv" Oct 01 15:12:58 crc 
kubenswrapper[4771]: I1001 15:12:58.583201 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-627pv" Oct 01 15:12:59 crc kubenswrapper[4771]: I1001 15:12:59.718180 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2vz57"] Oct 01 15:12:59 crc kubenswrapper[4771]: I1001 15:12:59.769568 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-627pv"] Oct 01 15:12:59 crc kubenswrapper[4771]: W1001 15:12:59.773894 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod74e7f311_b206_4643_a136_5d40e30e7e39.slice/crio-a63598149ba177f4467dfddbfbece66cfcc65d1c77b2c601d597de43257854cb WatchSource:0}: Error finding container a63598149ba177f4467dfddbfbece66cfcc65d1c77b2c601d597de43257854cb: Status 404 returned error can't find the container with id a63598149ba177f4467dfddbfbece66cfcc65d1c77b2c601d597de43257854cb Oct 01 15:12:59 crc kubenswrapper[4771]: W1001 15:12:59.775530 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6b325a5_74ef_42ee_89a8_c5855a4dc1f8.slice/crio-139e0c3681301033b5447e2460c08f7c58820597056a56b72301570d4db82bfc WatchSource:0}: Error finding container 139e0c3681301033b5447e2460c08f7c58820597056a56b72301570d4db82bfc: Status 404 returned error can't find the container with id 139e0c3681301033b5447e2460c08f7c58820597056a56b72301570d4db82bfc Oct 01 15:12:59 crc kubenswrapper[4771]: I1001 15:12:59.780319 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-ddg7l"] Oct 01 15:12:59 crc kubenswrapper[4771]: I1001 15:12:59.949261 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lrw6m" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.019618 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfd8s\" (UniqueName: \"kubernetes.io/projected/448dc4ba-7224-4c6b-8448-e50389967c50-kube-api-access-rfd8s\") pod \"448dc4ba-7224-4c6b-8448-e50389967c50\" (UID: \"448dc4ba-7224-4c6b-8448-e50389967c50\") " Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.045661 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/448dc4ba-7224-4c6b-8448-e50389967c50-kube-api-access-rfd8s" (OuterVolumeSpecName: "kube-api-access-rfd8s") pod "448dc4ba-7224-4c6b-8448-e50389967c50" (UID: "448dc4ba-7224-4c6b-8448-e50389967c50"). InnerVolumeSpecName "kube-api-access-rfd8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.121639 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfd8s\" (UniqueName: \"kubernetes.io/projected/448dc4ba-7224-4c6b-8448-e50389967c50-kube-api-access-rfd8s\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.341235 4771 generic.go:334] "Generic (PLEG): container finished" podID="74e7f311-b206-4643-a136-5d40e30e7e39" containerID="f8ab8f67ce768baea8ebbdebb514dd523e38ac6f89cde17f00bf3b1087231269" exitCode=0 Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.341453 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-627pv" event={"ID":"74e7f311-b206-4643-a136-5d40e30e7e39","Type":"ContainerDied","Data":"f8ab8f67ce768baea8ebbdebb514dd523e38ac6f89cde17f00bf3b1087231269"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.341608 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-627pv" 
event={"ID":"74e7f311-b206-4643-a136-5d40e30e7e39","Type":"ContainerStarted","Data":"a63598149ba177f4467dfddbfbece66cfcc65d1c77b2c601d597de43257854cb"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.346456 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"d4762ffa6a50a545f1e66bf0769e41750626a0ce3b27e6deb5fe79dfc7e0d557"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.346493 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"eae471868210d30bc0c3ec8fb828d056217e94d38b59fc085e53f01b6b818633"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.346507 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"373f556c0d7d819de67fefb3a0343213745e58c36a464bc3ee44117649aed848"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.348227 4771 generic.go:334] "Generic (PLEG): container finished" podID="650906d6-4b76-4503-b0ee-e59f0a3302fb" containerID="9eed105f8bdc1d5e2c48424c5dbef253c3805ae6574dce161114c88b6a63e677" exitCode=0 Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.348286 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2vz57" event={"ID":"650906d6-4b76-4503-b0ee-e59f0a3302fb","Type":"ContainerDied","Data":"9eed105f8bdc1d5e2c48424c5dbef253c3805ae6574dce161114c88b6a63e677"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.348307 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2vz57" event={"ID":"650906d6-4b76-4503-b0ee-e59f0a3302fb","Type":"ContainerStarted","Data":"0af49154e2263df0d5a5fa3da512bc7435a6a5f6160ae06f23a1e4840b022677"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.349664 4771 
generic.go:334] "Generic (PLEG): container finished" podID="b6b325a5-74ef-42ee-89a8-c5855a4dc1f8" containerID="5e844da9189ece3df4085c7933f71cfbfda01ddbda537ed4082972000854297a" exitCode=0 Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.349703 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ddg7l" event={"ID":"b6b325a5-74ef-42ee-89a8-c5855a4dc1f8","Type":"ContainerDied","Data":"5e844da9189ece3df4085c7933f71cfbfda01ddbda537ed4082972000854297a"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.349718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ddg7l" event={"ID":"b6b325a5-74ef-42ee-89a8-c5855a4dc1f8","Type":"ContainerStarted","Data":"139e0c3681301033b5447e2460c08f7c58820597056a56b72301570d4db82bfc"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.350970 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrw6m" event={"ID":"448dc4ba-7224-4c6b-8448-e50389967c50","Type":"ContainerDied","Data":"027800442f76982b7e8c7faa7205c60ab187a5a9beee78a6eb34c53df58389b6"} Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.350995 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="027800442f76982b7e8c7faa7205c60ab187a5a9beee78a6eb34c53df58389b6" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.351036 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lrw6m" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.890352 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-330c-account-create-m6s7j"] Oct 01 15:13:00 crc kubenswrapper[4771]: E1001 15:13:00.890849 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="448dc4ba-7224-4c6b-8448-e50389967c50" containerName="mariadb-database-create" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.890865 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="448dc4ba-7224-4c6b-8448-e50389967c50" containerName="mariadb-database-create" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.891128 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="448dc4ba-7224-4c6b-8448-e50389967c50" containerName="mariadb-database-create" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.891829 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-330c-account-create-m6s7j" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.895031 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 01 15:13:00 crc kubenswrapper[4771]: I1001 15:13:00.931021 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-330c-account-create-m6s7j"] Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.036649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wt8v\" (UniqueName: \"kubernetes.io/projected/48bdd44b-bf9e-4349-90de-9c3e1126def6-kube-api-access-6wt8v\") pod \"keystone-330c-account-create-m6s7j\" (UID: \"48bdd44b-bf9e-4349-90de-9c3e1126def6\") " pod="openstack/keystone-330c-account-create-m6s7j" Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.138925 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wt8v\" (UniqueName: 
\"kubernetes.io/projected/48bdd44b-bf9e-4349-90de-9c3e1126def6-kube-api-access-6wt8v\") pod \"keystone-330c-account-create-m6s7j\" (UID: \"48bdd44b-bf9e-4349-90de-9c3e1126def6\") " pod="openstack/keystone-330c-account-create-m6s7j" Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.165270 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wt8v\" (UniqueName: \"kubernetes.io/projected/48bdd44b-bf9e-4349-90de-9c3e1126def6-kube-api-access-6wt8v\") pod \"keystone-330c-account-create-m6s7j\" (UID: \"48bdd44b-bf9e-4349-90de-9c3e1126def6\") " pod="openstack/keystone-330c-account-create-m6s7j" Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.210419 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-e413-account-create-wghx5"] Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.210708 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-330c-account-create-m6s7j" Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.211789 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-e413-account-create-wghx5" Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.214589 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.224719 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e413-account-create-wghx5"] Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.343059 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6blh4\" (UniqueName: \"kubernetes.io/projected/d31bdfbc-f66b-4a6f-b64c-f56b4a63c481-kube-api-access-6blh4\") pod \"placement-e413-account-create-wghx5\" (UID: \"d31bdfbc-f66b-4a6f-b64c-f56b4a63c481\") " pod="openstack/placement-e413-account-create-wghx5" Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.367298 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"78e06236ac31ba2a667fba46e41e27b6b9aeaf36e37b92e276cd075cebe64369"} Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.444202 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6blh4\" (UniqueName: \"kubernetes.io/projected/d31bdfbc-f66b-4a6f-b64c-f56b4a63c481-kube-api-access-6blh4\") pod \"placement-e413-account-create-wghx5\" (UID: \"d31bdfbc-f66b-4a6f-b64c-f56b4a63c481\") " pod="openstack/placement-e413-account-create-wghx5" Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.474448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6blh4\" (UniqueName: \"kubernetes.io/projected/d31bdfbc-f66b-4a6f-b64c-f56b4a63c481-kube-api-access-6blh4\") pod \"placement-e413-account-create-wghx5\" (UID: \"d31bdfbc-f66b-4a6f-b64c-f56b4a63c481\") " pod="openstack/placement-e413-account-create-wghx5" Oct 01 15:13:01 crc 
kubenswrapper[4771]: I1001 15:13:01.606454 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e413-account-create-wghx5"
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.714894 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-330c-account-create-m6s7j"]
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.835791 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ddg7l"
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.844319 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-627pv"
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.867078 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2vz57"
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.922499 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zpdvh" podUID="20d8761e-4ce2-4312-8a80-8c3ce8908f2c" containerName="ovn-controller" probeResult="failure" output=<
Oct 01 15:13:01 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 01 15:13:01 crc kubenswrapper[4771]: >
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.950941 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcpgl\" (UniqueName: \"kubernetes.io/projected/650906d6-4b76-4503-b0ee-e59f0a3302fb-kube-api-access-lcpgl\") pod \"650906d6-4b76-4503-b0ee-e59f0a3302fb\" (UID: \"650906d6-4b76-4503-b0ee-e59f0a3302fb\") "
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.951172 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vfkr\" (UniqueName: \"kubernetes.io/projected/74e7f311-b206-4643-a136-5d40e30e7e39-kube-api-access-2vfkr\") pod \"74e7f311-b206-4643-a136-5d40e30e7e39\" (UID: \"74e7f311-b206-4643-a136-5d40e30e7e39\") "
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.951357 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jltm5\" (UniqueName: \"kubernetes.io/projected/b6b325a5-74ef-42ee-89a8-c5855a4dc1f8-kube-api-access-jltm5\") pod \"b6b325a5-74ef-42ee-89a8-c5855a4dc1f8\" (UID: \"b6b325a5-74ef-42ee-89a8-c5855a4dc1f8\") "
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.957190 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6b325a5-74ef-42ee-89a8-c5855a4dc1f8-kube-api-access-jltm5" (OuterVolumeSpecName: "kube-api-access-jltm5") pod "b6b325a5-74ef-42ee-89a8-c5855a4dc1f8" (UID: "b6b325a5-74ef-42ee-89a8-c5855a4dc1f8"). InnerVolumeSpecName "kube-api-access-jltm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.957943 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74e7f311-b206-4643-a136-5d40e30e7e39-kube-api-access-2vfkr" (OuterVolumeSpecName: "kube-api-access-2vfkr") pod "74e7f311-b206-4643-a136-5d40e30e7e39" (UID: "74e7f311-b206-4643-a136-5d40e30e7e39"). InnerVolumeSpecName "kube-api-access-2vfkr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:13:01 crc kubenswrapper[4771]: I1001 15:13:01.962926 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/650906d6-4b76-4503-b0ee-e59f0a3302fb-kube-api-access-lcpgl" (OuterVolumeSpecName: "kube-api-access-lcpgl") pod "650906d6-4b76-4503-b0ee-e59f0a3302fb" (UID: "650906d6-4b76-4503-b0ee-e59f0a3302fb"). InnerVolumeSpecName "kube-api-access-lcpgl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.057883 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vfkr\" (UniqueName: \"kubernetes.io/projected/74e7f311-b206-4643-a136-5d40e30e7e39-kube-api-access-2vfkr\") on node \"crc\" DevicePath \"\""
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.058127 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jltm5\" (UniqueName: \"kubernetes.io/projected/b6b325a5-74ef-42ee-89a8-c5855a4dc1f8-kube-api-access-jltm5\") on node \"crc\" DevicePath \"\""
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.058219 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcpgl\" (UniqueName: \"kubernetes.io/projected/650906d6-4b76-4503-b0ee-e59f0a3302fb-kube-api-access-lcpgl\") on node \"crc\" DevicePath \"\""
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.083095 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-e413-account-create-wghx5"]
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.379177 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2vz57" event={"ID":"650906d6-4b76-4503-b0ee-e59f0a3302fb","Type":"ContainerDied","Data":"0af49154e2263df0d5a5fa3da512bc7435a6a5f6160ae06f23a1e4840b022677"}
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.379440 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0af49154e2263df0d5a5fa3da512bc7435a6a5f6160ae06f23a1e4840b022677"
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.379290 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2vz57"
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.381034 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-627pv" event={"ID":"74e7f311-b206-4643-a136-5d40e30e7e39","Type":"ContainerDied","Data":"a63598149ba177f4467dfddbfbece66cfcc65d1c77b2c601d597de43257854cb"}
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.381098 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a63598149ba177f4467dfddbfbece66cfcc65d1c77b2c601d597de43257854cb"
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.381055 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-627pv"
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.384480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-330c-account-create-m6s7j" event={"ID":"48bdd44b-bf9e-4349-90de-9c3e1126def6","Type":"ContainerStarted","Data":"62eeaba437500f0b27f8a9b401ccf4e5b965dad0ed30093065e403e91fef3b5b"}
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.384542 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-330c-account-create-m6s7j" event={"ID":"48bdd44b-bf9e-4349-90de-9c3e1126def6","Type":"ContainerStarted","Data":"bda4835601f60ff8a5a457198c2a07dcc1a9b228e577411ed15f045dfdfb62ae"}
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.387876 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-ddg7l" event={"ID":"b6b325a5-74ef-42ee-89a8-c5855a4dc1f8","Type":"ContainerDied","Data":"139e0c3681301033b5447e2460c08f7c58820597056a56b72301570d4db82bfc"}
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.387924 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="139e0c3681301033b5447e2460c08f7c58820597056a56b72301570d4db82bfc"
Oct 01 15:13:02 crc kubenswrapper[4771]: I1001 15:13:02.387988 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-ddg7l"
Oct 01 15:13:02 crc kubenswrapper[4771]: W1001 15:13:02.628926 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd31bdfbc_f66b_4a6f_b64c_f56b4a63c481.slice/crio-ca3302706dad586c90248bde801a3d2543cdc5c9007aa52b4d4eb2d2ee84e732 WatchSource:0}: Error finding container ca3302706dad586c90248bde801a3d2543cdc5c9007aa52b4d4eb2d2ee84e732: Status 404 returned error can't find the container with id ca3302706dad586c90248bde801a3d2543cdc5c9007aa52b4d4eb2d2ee84e732
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.400648 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"8bfedc72a36562014a2a2f27448c77825dc4313dd692e1a47fab8de561d4efb1"}
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.400985 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"579a78813547a4b43df0ede6e4b13abe1111dc6a21354d95a87c4840949eb017"}
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.403395 4771 generic.go:334] "Generic (PLEG): container finished" podID="d31bdfbc-f66b-4a6f-b64c-f56b4a63c481" containerID="36e14aecc4196441acb3b318260ee774c457f14a319476e9aa52221e7d5a388f" exitCode=0
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.403474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e413-account-create-wghx5" event={"ID":"d31bdfbc-f66b-4a6f-b64c-f56b4a63c481","Type":"ContainerDied","Data":"36e14aecc4196441acb3b318260ee774c457f14a319476e9aa52221e7d5a388f"}
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.403525 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e413-account-create-wghx5" event={"ID":"d31bdfbc-f66b-4a6f-b64c-f56b4a63c481","Type":"ContainerStarted","Data":"ca3302706dad586c90248bde801a3d2543cdc5c9007aa52b4d4eb2d2ee84e732"}
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.405125 4771 generic.go:334] "Generic (PLEG): container finished" podID="48bdd44b-bf9e-4349-90de-9c3e1126def6" containerID="62eeaba437500f0b27f8a9b401ccf4e5b965dad0ed30093065e403e91fef3b5b" exitCode=0
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.405158 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-330c-account-create-m6s7j" event={"ID":"48bdd44b-bf9e-4349-90de-9c3e1126def6","Type":"ContainerDied","Data":"62eeaba437500f0b27f8a9b401ccf4e5b965dad0ed30093065e403e91fef3b5b"}
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.641340 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-330c-account-create-m6s7j"
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.793831 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wt8v\" (UniqueName: \"kubernetes.io/projected/48bdd44b-bf9e-4349-90de-9c3e1126def6-kube-api-access-6wt8v\") pod \"48bdd44b-bf9e-4349-90de-9c3e1126def6\" (UID: \"48bdd44b-bf9e-4349-90de-9c3e1126def6\") "
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.799258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48bdd44b-bf9e-4349-90de-9c3e1126def6-kube-api-access-6wt8v" (OuterVolumeSpecName: "kube-api-access-6wt8v") pod "48bdd44b-bf9e-4349-90de-9c3e1126def6" (UID: "48bdd44b-bf9e-4349-90de-9c3e1126def6"). InnerVolumeSpecName "kube-api-access-6wt8v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:13:03 crc kubenswrapper[4771]: I1001 15:13:03.895505 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wt8v\" (UniqueName: \"kubernetes.io/projected/48bdd44b-bf9e-4349-90de-9c3e1126def6-kube-api-access-6wt8v\") on node \"crc\" DevicePath \"\""
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.423449 4771 generic.go:334] "Generic (PLEG): container finished" podID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" containerID="c3eb3e3832dbb71c6552b52fffc9d46d565d2ed58dc9ada50391280b5d29480a" exitCode=0
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.423529 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f11eb6e5-8306-4db5-af63-ef4d869f7e2c","Type":"ContainerDied","Data":"c3eb3e3832dbb71c6552b52fffc9d46d565d2ed58dc9ada50391280b5d29480a"}
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.425185 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-330c-account-create-m6s7j"
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.425201 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-330c-account-create-m6s7j" event={"ID":"48bdd44b-bf9e-4349-90de-9c3e1126def6","Type":"ContainerDied","Data":"bda4835601f60ff8a5a457198c2a07dcc1a9b228e577411ed15f045dfdfb62ae"}
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.425247 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bda4835601f60ff8a5a457198c2a07dcc1a9b228e577411ed15f045dfdfb62ae"
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.430415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"f7c8370903ce107375839ecf753738c7d59627b9f4797eb7f1e261cb408fadbe"}
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.430471 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"8d410cc3c2509e42c54db5ca32b91ba693ac045b958dbee1857a4501b2e8a32f"}
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.779940 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e413-account-create-wghx5"
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.910837 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6blh4\" (UniqueName: \"kubernetes.io/projected/d31bdfbc-f66b-4a6f-b64c-f56b4a63c481-kube-api-access-6blh4\") pod \"d31bdfbc-f66b-4a6f-b64c-f56b4a63c481\" (UID: \"d31bdfbc-f66b-4a6f-b64c-f56b4a63c481\") "
Oct 01 15:13:04 crc kubenswrapper[4771]: I1001 15:13:04.915815 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d31bdfbc-f66b-4a6f-b64c-f56b4a63c481-kube-api-access-6blh4" (OuterVolumeSpecName: "kube-api-access-6blh4") pod "d31bdfbc-f66b-4a6f-b64c-f56b4a63c481" (UID: "d31bdfbc-f66b-4a6f-b64c-f56b4a63c481"). InnerVolumeSpecName "kube-api-access-6blh4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:13:05 crc kubenswrapper[4771]: I1001 15:13:05.013830 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6blh4\" (UniqueName: \"kubernetes.io/projected/d31bdfbc-f66b-4a6f-b64c-f56b4a63c481-kube-api-access-6blh4\") on node \"crc\" DevicePath \"\""
Oct 01 15:13:05 crc kubenswrapper[4771]: I1001 15:13:05.454661 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"f90d7d84535aceb93ac9cf48fe5f36b28aa924f34cafac8a300d0561510eeab2"}
Oct 01 15:13:05 crc kubenswrapper[4771]: I1001 15:13:05.458684 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-e413-account-create-wghx5"
Oct 01 15:13:05 crc kubenswrapper[4771]: I1001 15:13:05.458760 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-e413-account-create-wghx5" event={"ID":"d31bdfbc-f66b-4a6f-b64c-f56b4a63c481","Type":"ContainerDied","Data":"ca3302706dad586c90248bde801a3d2543cdc5c9007aa52b4d4eb2d2ee84e732"}
Oct 01 15:13:05 crc kubenswrapper[4771]: I1001 15:13:05.458819 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca3302706dad586c90248bde801a3d2543cdc5c9007aa52b4d4eb2d2ee84e732"
Oct 01 15:13:05 crc kubenswrapper[4771]: I1001 15:13:05.461775 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f11eb6e5-8306-4db5-af63-ef4d869f7e2c","Type":"ContainerStarted","Data":"588dc45bf31c18016b37452ed3e4b68b57aad853692243431af12969d4a441e0"}
Oct 01 15:13:05 crc kubenswrapper[4771]: I1001 15:13:05.463378 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Oct 01 15:13:05 crc kubenswrapper[4771]: I1001 15:13:05.496506 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=-9223371947.358292 podStartE2EDuration="1m29.496484153s" podCreationTimestamp="2025-10-01 15:11:36 +0000 UTC" firstStartedPulling="2025-10-01 15:11:38.556700691 +0000 UTC m=+943.175875862" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:05.493360276 +0000 UTC m=+1030.112535447" watchObservedRunningTime="2025-10-01 15:13:05.496484153 +0000 UTC m=+1030.115659334"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.447446 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-l6hss"]
Oct 01 15:13:06 crc kubenswrapper[4771]: E1001 15:13:06.450331 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="650906d6-4b76-4503-b0ee-e59f0a3302fb" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450361 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="650906d6-4b76-4503-b0ee-e59f0a3302fb" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: E1001 15:13:06.450390 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74e7f311-b206-4643-a136-5d40e30e7e39" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450399 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="74e7f311-b206-4643-a136-5d40e30e7e39" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: E1001 15:13:06.450421 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d31bdfbc-f66b-4a6f-b64c-f56b4a63c481" containerName="mariadb-account-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450427 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d31bdfbc-f66b-4a6f-b64c-f56b4a63c481" containerName="mariadb-account-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: E1001 15:13:06.450438 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48bdd44b-bf9e-4349-90de-9c3e1126def6" containerName="mariadb-account-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450444 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bdd44b-bf9e-4349-90de-9c3e1126def6" containerName="mariadb-account-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: E1001 15:13:06.450454 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6b325a5-74ef-42ee-89a8-c5855a4dc1f8" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450461 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6b325a5-74ef-42ee-89a8-c5855a4dc1f8" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450634 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="74e7f311-b206-4643-a136-5d40e30e7e39" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450653 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d31bdfbc-f66b-4a6f-b64c-f56b4a63c481" containerName="mariadb-account-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450666 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6b325a5-74ef-42ee-89a8-c5855a4dc1f8" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450677 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="650906d6-4b76-4503-b0ee-e59f0a3302fb" containerName="mariadb-database-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.450688 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="48bdd44b-bf9e-4349-90de-9c3e1126def6" containerName="mariadb-account-create"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.451276 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.454413 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.454465 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.454536 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h6k26"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.454692 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.459347 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-l6hss"]
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.503094 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2939-account-create-9lwzm"]
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.504226 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2939-account-create-9lwzm"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.511054 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.513527 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"545d98573e6a2b5fb8f98ad7d04e5362fd5493cb029c636a79da1556d3deafcc"}
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.513575 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2939-account-create-9lwzm"]
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.513591 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"54b136ca6899d62b177efa98b6e54cddb305d14b93769f49f26b3f003f963fc2"}
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.513600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"43520bdfc8db2286a79fed436064dad9cbc094fda3317a55f33659a680f638ac"}
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.513612 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"a0ed94b9a423ba7d6e63a7ae305165f545e26c2a14ba98f99520f51fd4321974"}
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.539774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-config-data\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.539833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz68v\" (UniqueName: \"kubernetes.io/projected/04988f3e-c2c9-454f-8ec5-7d269d07685a-kube-api-access-zz68v\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.539862 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-combined-ca-bundle\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.644327 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-config-data\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.644916 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjntg\" (UniqueName: \"kubernetes.io/projected/dd4bd627-838e-4099-9801-c08693e22b9b-kube-api-access-cjntg\") pod \"glance-2939-account-create-9lwzm\" (UID: \"dd4bd627-838e-4099-9801-c08693e22b9b\") " pod="openstack/glance-2939-account-create-9lwzm"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.644995 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz68v\" (UniqueName: \"kubernetes.io/projected/04988f3e-c2c9-454f-8ec5-7d269d07685a-kube-api-access-zz68v\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.645664 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-combined-ca-bundle\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.650793 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-combined-ca-bundle\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.652661 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-config-data\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.665509 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz68v\" (UniqueName: \"kubernetes.io/projected/04988f3e-c2c9-454f-8ec5-7d269d07685a-kube-api-access-zz68v\") pod \"keystone-db-sync-l6hss\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.747445 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjntg\" (UniqueName: \"kubernetes.io/projected/dd4bd627-838e-4099-9801-c08693e22b9b-kube-api-access-cjntg\") pod \"glance-2939-account-create-9lwzm\" (UID: \"dd4bd627-838e-4099-9801-c08693e22b9b\") " pod="openstack/glance-2939-account-create-9lwzm"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.764442 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjntg\" (UniqueName: \"kubernetes.io/projected/dd4bd627-838e-4099-9801-c08693e22b9b-kube-api-access-cjntg\") pod \"glance-2939-account-create-9lwzm\" (UID: \"dd4bd627-838e-4099-9801-c08693e22b9b\") " pod="openstack/glance-2939-account-create-9lwzm"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.769370 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-l6hss"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.825099 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2939-account-create-9lwzm"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.955866 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-zpdvh" podUID="20d8761e-4ce2-4312-8a80-8c3ce8908f2c" containerName="ovn-controller" probeResult="failure" output=<
Oct 01 15:13:06 crc kubenswrapper[4771]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Oct 01 15:13:06 crc kubenswrapper[4771]: >
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.979269 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rv4hj"
Oct 01 15:13:06 crc kubenswrapper[4771]: I1001 15:13:06.997134 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-rv4hj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.197796 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-zpdvh-config-sxncj"]
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.199234 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.201325 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.212342 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zpdvh-config-sxncj"]
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.256827 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-scripts\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.256963 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.257160 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run-ovn\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.257383 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnccx\" (UniqueName: \"kubernetes.io/projected/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-kube-api-access-xnccx\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.257428 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-additional-scripts\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.257674 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-log-ovn\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.268376 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-l6hss"]
Oct 01 15:13:07 crc kubenswrapper[4771]: W1001 15:13:07.268518 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04988f3e_c2c9_454f_8ec5_7d269d07685a.slice/crio-095cebea4a256c978e099afd1224a4e2f7c02b64e5de7f34374abc1777868029 WatchSource:0}: Error finding container 095cebea4a256c978e099afd1224a4e2f7c02b64e5de7f34374abc1777868029: Status 404 returned error can't find the container with id 095cebea4a256c978e099afd1224a4e2f7c02b64e5de7f34374abc1777868029
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.327762 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2939-account-create-9lwzm"]
Oct 01 15:13:07 crc kubenswrapper[4771]: W1001 15:13:07.328697 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd4bd627_838e_4099_9801_c08693e22b9b.slice/crio-4905753f6550f4835be7d44bb1cb10f6fa2df2cca8e71e8a3fe0cb8dbeb94883 WatchSource:0}: Error finding container 4905753f6550f4835be7d44bb1cb10f6fa2df2cca8e71e8a3fe0cb8dbeb94883: Status 404 returned error can't find the container with id 4905753f6550f4835be7d44bb1cb10f6fa2df2cca8e71e8a3fe0cb8dbeb94883
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.359700 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnccx\" (UniqueName: \"kubernetes.io/projected/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-kube-api-access-xnccx\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.359757 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-additional-scripts\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.359834 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-log-ovn\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.359860 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-scripts\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.359882 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.359941 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run-ovn\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.360209 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run-ovn\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.360872 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.360885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-log-ovn\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.361251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-additional-scripts\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.366828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-scripts\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.388628 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnccx\" (UniqueName: \"kubernetes.io/projected/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-kube-api-access-xnccx\") pod \"ovn-controller-zpdvh-config-sxncj\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.522217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l6hss" event={"ID":"04988f3e-c2c9-454f-8ec5-7d269d07685a","Type":"ContainerStarted","Data":"095cebea4a256c978e099afd1224a4e2f7c02b64e5de7f34374abc1777868029"}
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.524445 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2939-account-create-9lwzm" event={"ID":"dd4bd627-838e-4099-9801-c08693e22b9b","Type":"ContainerStarted","Data":"d1f4e0400758a0a5710359f819b00860985ebbef25d26a61d9cc05553e6efc80"}
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.524491 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2939-account-create-9lwzm" event={"ID":"dd4bd627-838e-4099-9801-c08693e22b9b","Type":"ContainerStarted","Data":"4905753f6550f4835be7d44bb1cb10f6fa2df2cca8e71e8a3fe0cb8dbeb94883"}
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.531349 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zpdvh-config-sxncj"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.531582 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"42e8bf10afdde268d72f3e8882fb2edac6d8350961507dd2b2d9318155829b6d"}
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.531633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"1916131b-f4ff-4f49-8abc-a640dc07abc4","Type":"ContainerStarted","Data":"737d074ad76d869cecacc6cfd2d69a6d753ba0e041656315fad7ff7b0de53822"}
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.543346 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-2939-account-create-9lwzm" podStartSLOduration=1.5433157419999999 podStartE2EDuration="1.543315742s" podCreationTimestamp="2025-10-01 15:13:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:07.540879421 +0000 UTC m=+1032.160054592" watchObservedRunningTime="2025-10-01 15:13:07.543315742 +0000 UTC m=+1032.162490953"
Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.588519 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.833769432 podStartE2EDuration="44.588499258s" podCreationTimestamp="2025-10-01 15:12:23 +0000 UTC" firstStartedPulling="2025-10-01 15:12:57.23516299 +0000 UTC m=+1021.854338171" lastFinishedPulling="2025-10-01 15:13:04.989892786 +0000 UTC m=+1029.609067997"
observedRunningTime="2025-10-01 15:13:07.580521902 +0000 UTC m=+1032.199697113" watchObservedRunningTime="2025-10-01 15:13:07.588499258 +0000 UTC m=+1032.207674439" Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.831389 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wrk8t"] Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.832990 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.844033 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.852036 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wrk8t"] Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.967533 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-zpdvh-config-sxncj"] Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.970885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czs4n\" (UniqueName: \"kubernetes.io/projected/c3158c13-7e3a-4467-85bb-acb01fcba800-kube-api-access-czs4n\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.970982 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-config\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.971180 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.971294 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.971376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:07 crc kubenswrapper[4771]: I1001 15:13:07.971470 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:07 crc kubenswrapper[4771]: W1001 15:13:07.972996 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78e6ad24_e82e_4edf_b5ab_b7a0abdb686a.slice/crio-8c1c113c843ef468870a58b826b4f5651638b496059661abb236ab09c8a20c24 WatchSource:0}: Error finding container 8c1c113c843ef468870a58b826b4f5651638b496059661abb236ab09c8a20c24: Status 404 returned error can't find the container with id 
8c1c113c843ef468870a58b826b4f5651638b496059661abb236ab09c8a20c24 Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.011847 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-ec97-account-create-fmwdx"] Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.012897 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ec97-account-create-fmwdx" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.015298 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.022111 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ec97-account-create-fmwdx"] Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.073234 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czs4n\" (UniqueName: \"kubernetes.io/projected/c3158c13-7e3a-4467-85bb-acb01fcba800-kube-api-access-czs4n\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.073608 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-config\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.073633 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmdnf\" (UniqueName: \"kubernetes.io/projected/6d0dd8a9-f5fe-4838-892d-62ec5db46f3b-kube-api-access-gmdnf\") pod \"barbican-ec97-account-create-fmwdx\" (UID: \"6d0dd8a9-f5fe-4838-892d-62ec5db46f3b\") " pod="openstack/barbican-ec97-account-create-fmwdx" Oct 01 15:13:08 crc 
kubenswrapper[4771]: I1001 15:13:08.073699 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.073761 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.073799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.073837 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.075927 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.076090 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.076374 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.076514 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-config\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.077424 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.099939 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-bcc4-account-create-bgj79"] Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.101705 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-bcc4-account-create-bgj79" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.104777 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czs4n\" (UniqueName: \"kubernetes.io/projected/c3158c13-7e3a-4467-85bb-acb01fcba800-kube-api-access-czs4n\") pod \"dnsmasq-dns-77585f5f8c-wrk8t\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.105033 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.109360 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bcc4-account-create-bgj79"] Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.150844 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.176173 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmdnf\" (UniqueName: \"kubernetes.io/projected/6d0dd8a9-f5fe-4838-892d-62ec5db46f3b-kube-api-access-gmdnf\") pod \"barbican-ec97-account-create-fmwdx\" (UID: \"6d0dd8a9-f5fe-4838-892d-62ec5db46f3b\") " pod="openstack/barbican-ec97-account-create-fmwdx" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.176385 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgl5n\" (UniqueName: \"kubernetes.io/projected/86dbf463-9a57-4d6b-b352-b5a6c70d3e9c-kube-api-access-cgl5n\") pod \"cinder-bcc4-account-create-bgj79\" (UID: \"86dbf463-9a57-4d6b-b352-b5a6c70d3e9c\") " pod="openstack/cinder-bcc4-account-create-bgj79" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.200896 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmdnf\" 
(UniqueName: \"kubernetes.io/projected/6d0dd8a9-f5fe-4838-892d-62ec5db46f3b-kube-api-access-gmdnf\") pod \"barbican-ec97-account-create-fmwdx\" (UID: \"6d0dd8a9-f5fe-4838-892d-62ec5db46f3b\") " pod="openstack/barbican-ec97-account-create-fmwdx" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.277972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgl5n\" (UniqueName: \"kubernetes.io/projected/86dbf463-9a57-4d6b-b352-b5a6c70d3e9c-kube-api-access-cgl5n\") pod \"cinder-bcc4-account-create-bgj79\" (UID: \"86dbf463-9a57-4d6b-b352-b5a6c70d3e9c\") " pod="openstack/cinder-bcc4-account-create-bgj79" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.303190 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgl5n\" (UniqueName: \"kubernetes.io/projected/86dbf463-9a57-4d6b-b352-b5a6c70d3e9c-kube-api-access-cgl5n\") pod \"cinder-bcc4-account-create-bgj79\" (UID: \"86dbf463-9a57-4d6b-b352-b5a6c70d3e9c\") " pod="openstack/cinder-bcc4-account-create-bgj79" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.337967 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ec97-account-create-fmwdx" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.427421 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-ae86-account-create-jtrlh"] Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.429060 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-ae86-account-create-jtrlh" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.431248 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.438582 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ae86-account-create-jtrlh"] Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.460211 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bcc4-account-create-bgj79" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.480885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqdjd\" (UniqueName: \"kubernetes.io/projected/28639e17-35cd-4824-9390-0a1212a73c73-kube-api-access-tqdjd\") pod \"neutron-ae86-account-create-jtrlh\" (UID: \"28639e17-35cd-4824-9390-0a1212a73c73\") " pod="openstack/neutron-ae86-account-create-jtrlh" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.582506 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqdjd\" (UniqueName: \"kubernetes.io/projected/28639e17-35cd-4824-9390-0a1212a73c73-kube-api-access-tqdjd\") pod \"neutron-ae86-account-create-jtrlh\" (UID: \"28639e17-35cd-4824-9390-0a1212a73c73\") " pod="openstack/neutron-ae86-account-create-jtrlh" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.588920 4771 generic.go:334] "Generic (PLEG): container finished" podID="dd4bd627-838e-4099-9801-c08693e22b9b" containerID="d1f4e0400758a0a5710359f819b00860985ebbef25d26a61d9cc05553e6efc80" exitCode=0 Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.589245 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2939-account-create-9lwzm" 
event={"ID":"dd4bd627-838e-4099-9801-c08693e22b9b","Type":"ContainerDied","Data":"d1f4e0400758a0a5710359f819b00860985ebbef25d26a61d9cc05553e6efc80"} Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.593800 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zpdvh-config-sxncj" event={"ID":"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a","Type":"ContainerStarted","Data":"0d62b92665c1a93f086952661ef09c1be90e0a57654d03292b9451c0e1a66451"} Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.593834 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zpdvh-config-sxncj" event={"ID":"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a","Type":"ContainerStarted","Data":"8c1c113c843ef468870a58b826b4f5651638b496059661abb236ab09c8a20c24"} Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.608938 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqdjd\" (UniqueName: \"kubernetes.io/projected/28639e17-35cd-4824-9390-0a1212a73c73-kube-api-access-tqdjd\") pod \"neutron-ae86-account-create-jtrlh\" (UID: \"28639e17-35cd-4824-9390-0a1212a73c73\") " pod="openstack/neutron-ae86-account-create-jtrlh" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.634613 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-zpdvh-config-sxncj" podStartSLOduration=1.6345915290000002 podStartE2EDuration="1.634591529s" podCreationTimestamp="2025-10-01 15:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:08.630959918 +0000 UTC m=+1033.250135129" watchObservedRunningTime="2025-10-01 15:13:08.634591529 +0000 UTC m=+1033.253766700" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.649784 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wrk8t"] Oct 01 15:13:08 crc kubenswrapper[4771]: W1001 
15:13:08.649998 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3158c13_7e3a_4467_85bb_acb01fcba800.slice/crio-06e717b0d134806f8d7d505648eafd216737d8d9f715d61ef252289f83c5db6e WatchSource:0}: Error finding container 06e717b0d134806f8d7d505648eafd216737d8d9f715d61ef252289f83c5db6e: Status 404 returned error can't find the container with id 06e717b0d134806f8d7d505648eafd216737d8d9f715d61ef252289f83c5db6e Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.750486 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ae86-account-create-jtrlh" Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.905900 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-ec97-account-create-fmwdx"] Oct 01 15:13:08 crc kubenswrapper[4771]: I1001 15:13:08.994875 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bcc4-account-create-bgj79"] Oct 01 15:13:09 crc kubenswrapper[4771]: W1001 15:13:09.005538 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86dbf463_9a57_4d6b_b352_b5a6c70d3e9c.slice/crio-00777668dcff1f31329289cf48bbc7830b53379f88b46b80c3bdcc2b7973b46c WatchSource:0}: Error finding container 00777668dcff1f31329289cf48bbc7830b53379f88b46b80c3bdcc2b7973b46c: Status 404 returned error can't find the container with id 00777668dcff1f31329289cf48bbc7830b53379f88b46b80c3bdcc2b7973b46c Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.299332 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-ae86-account-create-jtrlh"] Oct 01 15:13:09 crc kubenswrapper[4771]: W1001 15:13:09.346044 4771 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28639e17_35cd_4824_9390_0a1212a73c73.slice/crio-cd36c8790c1df62dae4149ca018765987ea113e035d6c1520342e082e8f7239a WatchSource:0}: Error finding container cd36c8790c1df62dae4149ca018765987ea113e035d6c1520342e082e8f7239a: Status 404 returned error can't find the container with id cd36c8790c1df62dae4149ca018765987ea113e035d6c1520342e082e8f7239a Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.603457 4771 generic.go:334] "Generic (PLEG): container finished" podID="6d0dd8a9-f5fe-4838-892d-62ec5db46f3b" containerID="afb716d4146e55322085752768bfaf7dea05659a898bfd7a35e8f33432e12903" exitCode=0 Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.603554 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ec97-account-create-fmwdx" event={"ID":"6d0dd8a9-f5fe-4838-892d-62ec5db46f3b","Type":"ContainerDied","Data":"afb716d4146e55322085752768bfaf7dea05659a898bfd7a35e8f33432e12903"} Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.603586 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ec97-account-create-fmwdx" event={"ID":"6d0dd8a9-f5fe-4838-892d-62ec5db46f3b","Type":"ContainerStarted","Data":"159b5ee0fe247d6ca5a791a6d76f287ff1a3bb06e53d575261d21d15880efe1e"} Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.605656 4771 generic.go:334] "Generic (PLEG): container finished" podID="c3158c13-7e3a-4467-85bb-acb01fcba800" containerID="d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb" exitCode=0 Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.605705 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" event={"ID":"c3158c13-7e3a-4467-85bb-acb01fcba800","Type":"ContainerDied","Data":"d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb"} Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.605724 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" event={"ID":"c3158c13-7e3a-4467-85bb-acb01fcba800","Type":"ContainerStarted","Data":"06e717b0d134806f8d7d505648eafd216737d8d9f715d61ef252289f83c5db6e"} Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.607837 4771 generic.go:334] "Generic (PLEG): container finished" podID="28639e17-35cd-4824-9390-0a1212a73c73" containerID="cf136dc245827fbe5b3c4d31bb810a8befe8438a3a2fe35e4ca4f5c2614e03b8" exitCode=0 Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.607982 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ae86-account-create-jtrlh" event={"ID":"28639e17-35cd-4824-9390-0a1212a73c73","Type":"ContainerDied","Data":"cf136dc245827fbe5b3c4d31bb810a8befe8438a3a2fe35e4ca4f5c2614e03b8"} Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.608010 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ae86-account-create-jtrlh" event={"ID":"28639e17-35cd-4824-9390-0a1212a73c73","Type":"ContainerStarted","Data":"cd36c8790c1df62dae4149ca018765987ea113e035d6c1520342e082e8f7239a"} Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.610203 4771 generic.go:334] "Generic (PLEG): container finished" podID="86dbf463-9a57-4d6b-b352-b5a6c70d3e9c" containerID="877114b5ea07b63ad882069286ac22c0e0cb10181c19aa21af9f4b38a9f2d632" exitCode=0 Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.610256 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bcc4-account-create-bgj79" event={"ID":"86dbf463-9a57-4d6b-b352-b5a6c70d3e9c","Type":"ContainerDied","Data":"877114b5ea07b63ad882069286ac22c0e0cb10181c19aa21af9f4b38a9f2d632"} Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.610276 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bcc4-account-create-bgj79" event={"ID":"86dbf463-9a57-4d6b-b352-b5a6c70d3e9c","Type":"ContainerStarted","Data":"00777668dcff1f31329289cf48bbc7830b53379f88b46b80c3bdcc2b7973b46c"} Oct 01 15:13:09 
crc kubenswrapper[4771]: I1001 15:13:09.611551 4771 generic.go:334] "Generic (PLEG): container finished" podID="78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" containerID="0d62b92665c1a93f086952661ef09c1be90e0a57654d03292b9451c0e1a66451" exitCode=0 Oct 01 15:13:09 crc kubenswrapper[4771]: I1001 15:13:09.611769 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zpdvh-config-sxncj" event={"ID":"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a","Type":"ContainerDied","Data":"0d62b92665c1a93f086952661ef09c1be90e0a57654d03292b9451c0e1a66451"} Oct 01 15:13:11 crc kubenswrapper[4771]: I1001 15:13:11.931774 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-zpdvh" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.430262 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bcc4-account-create-bgj79" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.442705 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ae86-account-create-jtrlh" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.471527 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cgl5n\" (UniqueName: \"kubernetes.io/projected/86dbf463-9a57-4d6b-b352-b5a6c70d3e9c-kube-api-access-cgl5n\") pod \"86dbf463-9a57-4d6b-b352-b5a6c70d3e9c\" (UID: \"86dbf463-9a57-4d6b-b352-b5a6c70d3e9c\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.476060 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-ec97-account-create-fmwdx" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.479064 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86dbf463-9a57-4d6b-b352-b5a6c70d3e9c-kube-api-access-cgl5n" (OuterVolumeSpecName: "kube-api-access-cgl5n") pod "86dbf463-9a57-4d6b-b352-b5a6c70d3e9c" (UID: "86dbf463-9a57-4d6b-b352-b5a6c70d3e9c"). InnerVolumeSpecName "kube-api-access-cgl5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.530534 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2939-account-create-9lwzm" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.538968 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-zpdvh-config-sxncj" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.573518 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmdnf\" (UniqueName: \"kubernetes.io/projected/6d0dd8a9-f5fe-4838-892d-62ec5db46f3b-kube-api-access-gmdnf\") pod \"6d0dd8a9-f5fe-4838-892d-62ec5db46f3b\" (UID: \"6d0dd8a9-f5fe-4838-892d-62ec5db46f3b\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.573607 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjntg\" (UniqueName: \"kubernetes.io/projected/dd4bd627-838e-4099-9801-c08693e22b9b-kube-api-access-cjntg\") pod \"dd4bd627-838e-4099-9801-c08693e22b9b\" (UID: \"dd4bd627-838e-4099-9801-c08693e22b9b\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.573840 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqdjd\" (UniqueName: \"kubernetes.io/projected/28639e17-35cd-4824-9390-0a1212a73c73-kube-api-access-tqdjd\") pod \"28639e17-35cd-4824-9390-0a1212a73c73\" (UID: 
\"28639e17-35cd-4824-9390-0a1212a73c73\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.574258 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cgl5n\" (UniqueName: \"kubernetes.io/projected/86dbf463-9a57-4d6b-b352-b5a6c70d3e9c-kube-api-access-cgl5n\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.577174 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28639e17-35cd-4824-9390-0a1212a73c73-kube-api-access-tqdjd" (OuterVolumeSpecName: "kube-api-access-tqdjd") pod "28639e17-35cd-4824-9390-0a1212a73c73" (UID: "28639e17-35cd-4824-9390-0a1212a73c73"). InnerVolumeSpecName "kube-api-access-tqdjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.579666 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d0dd8a9-f5fe-4838-892d-62ec5db46f3b-kube-api-access-gmdnf" (OuterVolumeSpecName: "kube-api-access-gmdnf") pod "6d0dd8a9-f5fe-4838-892d-62ec5db46f3b" (UID: "6d0dd8a9-f5fe-4838-892d-62ec5db46f3b"). InnerVolumeSpecName "kube-api-access-gmdnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.650259 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-zpdvh-config-sxncj" event={"ID":"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a","Type":"ContainerDied","Data":"8c1c113c843ef468870a58b826b4f5651638b496059661abb236ab09c8a20c24"} Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.650302 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-zpdvh-config-sxncj" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.650307 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c1c113c843ef468870a58b826b4f5651638b496059661abb236ab09c8a20c24" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.651629 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-ec97-account-create-fmwdx" event={"ID":"6d0dd8a9-f5fe-4838-892d-62ec5db46f3b","Type":"ContainerDied","Data":"159b5ee0fe247d6ca5a791a6d76f287ff1a3bb06e53d575261d21d15880efe1e"} Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.651685 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="159b5ee0fe247d6ca5a791a6d76f287ff1a3bb06e53d575261d21d15880efe1e" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.651786 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-ec97-account-create-fmwdx" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.652778 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-ae86-account-create-jtrlh" event={"ID":"28639e17-35cd-4824-9390-0a1212a73c73","Type":"ContainerDied","Data":"cd36c8790c1df62dae4149ca018765987ea113e035d6c1520342e082e8f7239a"} Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.652800 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd36c8790c1df62dae4149ca018765987ea113e035d6c1520342e082e8f7239a" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.652807 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-ae86-account-create-jtrlh" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.653886 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-bcc4-account-create-bgj79" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.653889 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bcc4-account-create-bgj79" event={"ID":"86dbf463-9a57-4d6b-b352-b5a6c70d3e9c","Type":"ContainerDied","Data":"00777668dcff1f31329289cf48bbc7830b53379f88b46b80c3bdcc2b7973b46c"} Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.653934 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00777668dcff1f31329289cf48bbc7830b53379f88b46b80c3bdcc2b7973b46c" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.655183 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2939-account-create-9lwzm" event={"ID":"dd4bd627-838e-4099-9801-c08693e22b9b","Type":"ContainerDied","Data":"4905753f6550f4835be7d44bb1cb10f6fa2df2cca8e71e8a3fe0cb8dbeb94883"} Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.655200 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4905753f6550f4835be7d44bb1cb10f6fa2df2cca8e71e8a3fe0cb8dbeb94883" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.655246 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2939-account-create-9lwzm" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.675525 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-log-ovn\") pod \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.675620 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnccx\" (UniqueName: \"kubernetes.io/projected/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-kube-api-access-xnccx\") pod \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.675627 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" (UID: "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.675658 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run\") pod \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.675698 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run" (OuterVolumeSpecName: "var-run") pod "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" (UID: "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.675837 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run-ovn\") pod \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.675903 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" (UID: "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.676041 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-scripts\") pod \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.676072 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-additional-scripts\") pod \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\" (UID: \"78e6ad24-e82e-4edf-b5ab-b7a0abdb686a\") " Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.676953 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" (UID: "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.677359 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-scripts" (OuterVolumeSpecName: "scripts") pod "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" (UID: "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.677391 4771 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.677418 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmdnf\" (UniqueName: \"kubernetes.io/projected/6d0dd8a9-f5fe-4838-892d-62ec5db46f3b-kube-api-access-gmdnf\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.677432 4771 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.677445 4771 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.677459 4771 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-var-run\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.677472 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqdjd\" (UniqueName: 
\"kubernetes.io/projected/28639e17-35cd-4824-9390-0a1212a73c73-kube-api-access-tqdjd\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.683993 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-kube-api-access-xnccx" (OuterVolumeSpecName: "kube-api-access-xnccx") pod "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" (UID: "78e6ad24-e82e-4edf-b5ab-b7a0abdb686a"). InnerVolumeSpecName "kube-api-access-xnccx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.719638 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd4bd627-838e-4099-9801-c08693e22b9b-kube-api-access-cjntg" (OuterVolumeSpecName: "kube-api-access-cjntg") pod "dd4bd627-838e-4099-9801-c08693e22b9b" (UID: "dd4bd627-838e-4099-9801-c08693e22b9b"). InnerVolumeSpecName "kube-api-access-cjntg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.779012 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjntg\" (UniqueName: \"kubernetes.io/projected/dd4bd627-838e-4099-9801-c08693e22b9b-kube-api-access-cjntg\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.779060 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:13 crc kubenswrapper[4771]: I1001 15:13:13.779082 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnccx\" (UniqueName: \"kubernetes.io/projected/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a-kube-api-access-xnccx\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:14 crc kubenswrapper[4771]: I1001 15:13:14.667887 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" event={"ID":"c3158c13-7e3a-4467-85bb-acb01fcba800","Type":"ContainerStarted","Data":"ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77"} Oct 01 15:13:14 crc kubenswrapper[4771]: I1001 15:13:14.683860 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-zpdvh-config-sxncj"] Oct 01 15:13:14 crc kubenswrapper[4771]: I1001 15:13:14.693042 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-zpdvh-config-sxncj"] Oct 01 15:13:15 crc kubenswrapper[4771]: I1001 15:13:15.679691 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l6hss" event={"ID":"04988f3e-c2c9-454f-8ec5-7d269d07685a","Type":"ContainerStarted","Data":"e4374df0d4ede53e93aafc7d6199d0453d62027813d852ca7553887badb4500c"} Oct 01 15:13:15 crc kubenswrapper[4771]: I1001 15:13:15.680040 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:15 crc kubenswrapper[4771]: I1001 15:13:15.713037 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" podStartSLOduration=8.713001931 podStartE2EDuration="8.713001931s" podCreationTimestamp="2025-10-01 15:13:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:15.708404327 +0000 UTC m=+1040.327579578" watchObservedRunningTime="2025-10-01 15:13:15.713001931 +0000 UTC m=+1040.332177172" Oct 01 15:13:15 crc kubenswrapper[4771]: I1001 15:13:15.730142 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-l6hss" podStartSLOduration=2.068050917 podStartE2EDuration="9.730121303s" podCreationTimestamp="2025-10-01 15:13:06 +0000 UTC" firstStartedPulling="2025-10-01 15:13:07.270551921 +0000 UTC m=+1031.889727092" lastFinishedPulling="2025-10-01 
15:13:14.932622297 +0000 UTC m=+1039.551797478" observedRunningTime="2025-10-01 15:13:15.729181331 +0000 UTC m=+1040.348356512" watchObservedRunningTime="2025-10-01 15:13:15.730121303 +0000 UTC m=+1040.349296474" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.000280 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" path="/var/lib/kubelet/pods/78e6ad24-e82e-4edf-b5ab-b7a0abdb686a/volumes" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.616999 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gq7gl"] Oct 01 15:13:16 crc kubenswrapper[4771]: E1001 15:13:16.617660 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" containerName="ovn-config" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.617791 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" containerName="ovn-config" Oct 01 15:13:16 crc kubenswrapper[4771]: E1001 15:13:16.617915 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28639e17-35cd-4824-9390-0a1212a73c73" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.617993 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="28639e17-35cd-4824-9390-0a1212a73c73" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: E1001 15:13:16.618105 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d0dd8a9-f5fe-4838-892d-62ec5db46f3b" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.618216 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d0dd8a9-f5fe-4838-892d-62ec5db46f3b" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: E1001 15:13:16.618398 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd4bd627-838e-4099-9801-c08693e22b9b" 
containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.618695 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd4bd627-838e-4099-9801-c08693e22b9b" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: E1001 15:13:16.618942 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86dbf463-9a57-4d6b-b352-b5a6c70d3e9c" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.619685 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="86dbf463-9a57-4d6b-b352-b5a6c70d3e9c" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.620142 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e6ad24-e82e-4edf-b5ab-b7a0abdb686a" containerName="ovn-config" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.620284 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd4bd627-838e-4099-9801-c08693e22b9b" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.620402 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="86dbf463-9a57-4d6b-b352-b5a6c70d3e9c" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.620494 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d0dd8a9-f5fe-4838-892d-62ec5db46f3b" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.620883 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="28639e17-35cd-4824-9390-0a1212a73c73" containerName="mariadb-account-create" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.622123 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.624305 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hfxfb" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.625695 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.628695 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gq7gl"] Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.749839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-config-data\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.749910 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ncw8\" (UniqueName: \"kubernetes.io/projected/7b7689a2-6ac8-47ac-86f7-7456994c39ca-kube-api-access-6ncw8\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.749998 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-combined-ca-bundle\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.750051 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-db-sync-config-data\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.851822 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-combined-ca-bundle\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.851902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-db-sync-config-data\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.851978 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-config-data\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.852016 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ncw8\" (UniqueName: \"kubernetes.io/projected/7b7689a2-6ac8-47ac-86f7-7456994c39ca-kube-api-access-6ncw8\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.856387 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-db-sync-config-data\") pod \"glance-db-sync-gq7gl\" (UID: 
\"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.856534 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-config-data\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.857159 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-combined-ca-bundle\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:16 crc kubenswrapper[4771]: I1001 15:13:16.867427 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ncw8\" (UniqueName: \"kubernetes.io/projected/7b7689a2-6ac8-47ac-86f7-7456994c39ca-kube-api-access-6ncw8\") pod \"glance-db-sync-gq7gl\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:17 crc kubenswrapper[4771]: I1001 15:13:17.015118 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gq7gl" Oct 01 15:13:17 crc kubenswrapper[4771]: I1001 15:13:17.569190 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gq7gl"] Oct 01 15:13:17 crc kubenswrapper[4771]: W1001 15:13:17.579419 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b7689a2_6ac8_47ac_86f7_7456994c39ca.slice/crio-21ea8d6fe0775b907e8f132fc2b8712483c01cde09959015fbdec9235c0eb530 WatchSource:0}: Error finding container 21ea8d6fe0775b907e8f132fc2b8712483c01cde09959015fbdec9235c0eb530: Status 404 returned error can't find the container with id 21ea8d6fe0775b907e8f132fc2b8712483c01cde09959015fbdec9235c0eb530 Oct 01 15:13:17 crc kubenswrapper[4771]: I1001 15:13:17.737165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gq7gl" event={"ID":"7b7689a2-6ac8-47ac-86f7-7456994c39ca","Type":"ContainerStarted","Data":"21ea8d6fe0775b907e8f132fc2b8712483c01cde09959015fbdec9235c0eb530"} Oct 01 15:13:17 crc kubenswrapper[4771]: I1001 15:13:17.912679 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:13:18 crc kubenswrapper[4771]: I1001 15:13:18.759173 4771 generic.go:334] "Generic (PLEG): container finished" podID="04988f3e-c2c9-454f-8ec5-7d269d07685a" containerID="e4374df0d4ede53e93aafc7d6199d0453d62027813d852ca7553887badb4500c" exitCode=0 Oct 01 15:13:18 crc kubenswrapper[4771]: I1001 15:13:18.759500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l6hss" event={"ID":"04988f3e-c2c9-454f-8ec5-7d269d07685a","Type":"ContainerDied","Data":"e4374df0d4ede53e93aafc7d6199d0453d62027813d852ca7553887badb4500c"} Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.059850 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-l6hss" Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.135892 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-config-data\") pod \"04988f3e-c2c9-454f-8ec5-7d269d07685a\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.136082 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-combined-ca-bundle\") pod \"04988f3e-c2c9-454f-8ec5-7d269d07685a\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.136127 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz68v\" (UniqueName: \"kubernetes.io/projected/04988f3e-c2c9-454f-8ec5-7d269d07685a-kube-api-access-zz68v\") pod \"04988f3e-c2c9-454f-8ec5-7d269d07685a\" (UID: \"04988f3e-c2c9-454f-8ec5-7d269d07685a\") " Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.149469 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04988f3e-c2c9-454f-8ec5-7d269d07685a-kube-api-access-zz68v" (OuterVolumeSpecName: "kube-api-access-zz68v") pod "04988f3e-c2c9-454f-8ec5-7d269d07685a" (UID: "04988f3e-c2c9-454f-8ec5-7d269d07685a"). InnerVolumeSpecName "kube-api-access-zz68v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.160585 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04988f3e-c2c9-454f-8ec5-7d269d07685a" (UID: "04988f3e-c2c9-454f-8ec5-7d269d07685a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.185254 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-config-data" (OuterVolumeSpecName: "config-data") pod "04988f3e-c2c9-454f-8ec5-7d269d07685a" (UID: "04988f3e-c2c9-454f-8ec5-7d269d07685a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.238183 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz68v\" (UniqueName: \"kubernetes.io/projected/04988f3e-c2c9-454f-8ec5-7d269d07685a-kube-api-access-zz68v\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.238219 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.238232 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04988f3e-c2c9-454f-8ec5-7d269d07685a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.777112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-l6hss" event={"ID":"04988f3e-c2c9-454f-8ec5-7d269d07685a","Type":"ContainerDied","Data":"095cebea4a256c978e099afd1224a4e2f7c02b64e5de7f34374abc1777868029"} Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.777152 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="095cebea4a256c978e099afd1224a4e2f7c02b64e5de7f34374abc1777868029" Oct 01 15:13:20 crc kubenswrapper[4771]: I1001 15:13:20.777176 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-l6hss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.029852 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wrk8t"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.030472 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" podUID="c3158c13-7e3a-4467-85bb-acb01fcba800" containerName="dnsmasq-dns" containerID="cri-o://ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77" gracePeriod=10 Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.032873 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.091685 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-mpc2n"] Oct 01 15:13:21 crc kubenswrapper[4771]: E1001 15:13:21.092034 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04988f3e-c2c9-454f-8ec5-7d269d07685a" containerName="keystone-db-sync" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.092046 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="04988f3e-c2c9-454f-8ec5-7d269d07685a" containerName="keystone-db-sync" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.092212 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="04988f3e-c2c9-454f-8ec5-7d269d07685a" containerName="keystone-db-sync" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.093165 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.117191 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-mpc2n"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.145256 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t97db"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.148455 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.151237 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.151474 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h6k26" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.151577 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.151675 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.157769 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t97db"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.262818 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-scripts\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.262860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-fernet-keys\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.262900 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.262923 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-config\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.262950 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.262968 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.262987 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-config-data\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.263022 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62q6f\" (UniqueName: \"kubernetes.io/projected/ce394a9f-0955-4eaf-8617-c0087216c295-kube-api-access-62q6f\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.263038 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-combined-ca-bundle\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.263055 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdn5c\" (UniqueName: \"kubernetes.io/projected/e48032b5-450d-4975-9657-d7dd1fa95f3b-kube-api-access-mdn5c\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.263079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-svc\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.263095 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-credential-keys\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.268438 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7ccd9d5bb7-8mdjn"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.269864 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.277374 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.277551 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.277653 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-nnrqt" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.277766 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.288383 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-qfvj2"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.289451 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.295491 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.295799 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.295912 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m89mp" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.360833 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7ccd9d5bb7-8mdjn"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62q6f\" (UniqueName: \"kubernetes.io/projected/ce394a9f-0955-4eaf-8617-c0087216c295-kube-api-access-62q6f\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403624 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-combined-ca-bundle\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdn5c\" (UniqueName: \"kubernetes.io/projected/e48032b5-450d-4975-9657-d7dd1fa95f3b-kube-api-access-mdn5c\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403698 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-svc\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403718 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-credential-keys\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-config-data\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403801 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-config\") pod \"neutron-db-sync-qfvj2\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403855 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8nd\" (UniqueName: \"kubernetes.io/projected/b474b0f5-0050-4edd-afac-9237aa7284a5-kube-api-access-fl8nd\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403910 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-scripts\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403925 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-scripts\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403956 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-fernet-keys\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.403975 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-combined-ca-bundle\") pod \"neutron-db-sync-qfvj2\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.404023 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.404050 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcfr7\" (UniqueName: 
\"kubernetes.io/projected/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-kube-api-access-kcfr7\") pod \"neutron-db-sync-qfvj2\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.404070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-config\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.404122 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.404141 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.404161 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b474b0f5-0050-4edd-afac-9237aa7284a5-logs\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.404185 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-config-data\") pod 
\"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.404200 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b474b0f5-0050-4edd-afac-9237aa7284a5-horizon-secret-key\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.413373 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-combined-ca-bundle\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.419916 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-config\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.426995 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-fernet-keys\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.437452 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-credential-keys\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " 
pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.437906 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-config-data\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.438354 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-nb\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.438568 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-sb\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.439869 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62q6f\" (UniqueName: \"kubernetes.io/projected/ce394a9f-0955-4eaf-8617-c0087216c295-kube-api-access-62q6f\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.445256 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-swift-storage-0\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 
15:13:21.445786 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-svc\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.465114 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-scripts\") pod \"keystone-bootstrap-t97db\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.466992 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdn5c\" (UniqueName: \"kubernetes.io/projected/e48032b5-450d-4975-9657-d7dd1fa95f3b-kube-api-access-mdn5c\") pod \"dnsmasq-dns-55fff446b9-mpc2n\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.489162 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qfvj2"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.509183 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcfr7\" (UniqueName: \"kubernetes.io/projected/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-kube-api-access-kcfr7\") pod \"neutron-db-sync-qfvj2\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.509232 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b474b0f5-0050-4edd-afac-9237aa7284a5-logs\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc 
kubenswrapper[4771]: I1001 15:13:21.509255 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b474b0f5-0050-4edd-afac-9237aa7284a5-horizon-secret-key\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.509310 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-config-data\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.509328 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-config\") pod \"neutron-db-sync-qfvj2\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.509352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8nd\" (UniqueName: \"kubernetes.io/projected/b474b0f5-0050-4edd-afac-9237aa7284a5-kube-api-access-fl8nd\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.509380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-scripts\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.509399 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-combined-ca-bundle\") pod \"neutron-db-sync-qfvj2\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.512710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-scripts\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.512951 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-combined-ca-bundle\") pod \"neutron-db-sync-qfvj2\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.512971 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b474b0f5-0050-4edd-afac-9237aa7284a5-logs\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.513136 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-config-data\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.514677 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-config\") pod \"neutron-db-sync-qfvj2\" (UID: 
\"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.522534 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.523635 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b474b0f5-0050-4edd-afac-9237aa7284a5-horizon-secret-key\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.537488 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8nd\" (UniqueName: \"kubernetes.io/projected/b474b0f5-0050-4edd-afac-9237aa7284a5-kube-api-access-fl8nd\") pod \"horizon-7ccd9d5bb7-8mdjn\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.548838 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-99rss"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.549978 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.558884 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8dwpm" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.559088 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.559185 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.564549 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcfr7\" (UniqueName: \"kubernetes.io/projected/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-kube-api-access-kcfr7\") pod \"neutron-db-sync-qfvj2\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.572533 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-99rss"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.599166 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.601149 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.603202 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.604127 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.605945 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.615318 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-combined-ca-bundle\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621125 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g54n\" (UniqueName: \"kubernetes.io/projected/127a1d07-d1f4-4c95-abf8-da08884ea57a-kube-api-access-4g54n\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621196 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-scripts\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-scripts\") pod \"ceilometer-0\" (UID: 
\"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621362 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a523e37-804a-4173-8012-19848efc8cc0-etc-machine-id\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621459 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621488 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-db-sync-config-data\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621526 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-config-data\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621588 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-config-data\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 
01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621657 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-log-httpd\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-run-httpd\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621775 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75gmn\" (UniqueName: \"kubernetes.io/projected/9a523e37-804a-4173-8012-19848efc8cc0-kube-api-access-75gmn\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.621833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.624414 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.627745 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-nvszz"] Oct 01 15:13:21 crc kubenswrapper[4771]: E1001 15:13:21.628090 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3158c13-7e3a-4467-85bb-acb01fcba800" containerName="init" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.628221 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3158c13-7e3a-4467-85bb-acb01fcba800" containerName="init" Oct 01 15:13:21 crc kubenswrapper[4771]: E1001 15:13:21.628242 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3158c13-7e3a-4467-85bb-acb01fcba800" containerName="dnsmasq-dns" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.628249 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3158c13-7e3a-4467-85bb-acb01fcba800" containerName="dnsmasq-dns" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.628402 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3158c13-7e3a-4467-85bb-acb01fcba800" containerName="dnsmasq-dns" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.628967 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.631455 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ngq92" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.631842 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.640309 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.646661 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-d4zpd"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.649065 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.652902 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-w4m4k" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.653004 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.653140 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.659176 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-mpc2n"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.659779 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.665858 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nvszz"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.674193 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d4zpd"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.683675 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qjn7w"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.692553 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7975b9b95f-8lwcw"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.693273 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.694354 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.697357 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qjn7w"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.703084 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.705686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7975b9b95f-8lwcw"] Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.723250 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czs4n\" (UniqueName: \"kubernetes.io/projected/c3158c13-7e3a-4467-85bb-acb01fcba800-kube-api-access-czs4n\") pod \"c3158c13-7e3a-4467-85bb-acb01fcba800\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.723407 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-swift-storage-0\") pod \"c3158c13-7e3a-4467-85bb-acb01fcba800\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.723429 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-config\") pod \"c3158c13-7e3a-4467-85bb-acb01fcba800\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.723523 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-nb\") pod \"c3158c13-7e3a-4467-85bb-acb01fcba800\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.723592 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-sb\") pod \"c3158c13-7e3a-4467-85bb-acb01fcba800\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") 
" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.723609 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-svc\") pod \"c3158c13-7e3a-4467-85bb-acb01fcba800\" (UID: \"c3158c13-7e3a-4467-85bb-acb01fcba800\") " Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724143 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-combined-ca-bundle\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724168 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-config\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724196 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdbvj\" (UniqueName: \"kubernetes.io/projected/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-kube-api-access-pdbvj\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724223 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-scripts\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724243 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a523e37-804a-4173-8012-19848efc8cc0-etc-machine-id\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724262 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724308 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724326 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-db-sync-config-data\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-config-data\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724362 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-config-data\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724380 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhkrp\" (UniqueName: \"kubernetes.io/projected/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-kube-api-access-jhkrp\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-horizon-secret-key\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724416 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724435 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-log-httpd\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724451 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-logs\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-scripts\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724485 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-run-httpd\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724504 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91950d77-9457-412f-be07-626b553f6b8d-logs\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724522 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75gmn\" (UniqueName: \"kubernetes.io/projected/9a523e37-804a-4173-8012-19848efc8cc0-kube-api-access-75gmn\") pod \"cinder-db-sync-99rss\" (UID: 
\"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724545 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-combined-ca-bundle\") pod \"barbican-db-sync-nvszz\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724564 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-config-data\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.724583 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.726863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-db-sync-config-data\") pod \"barbican-db-sync-nvszz\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.726905 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-scripts\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " 
pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.726927 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg27h\" (UniqueName: \"kubernetes.io/projected/313f9ab1-8fa8-476f-94cb-1d94bd975a06-kube-api-access-tg27h\") pod \"barbican-db-sync-nvszz\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.726954 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlfnq\" (UniqueName: \"kubernetes.io/projected/91950d77-9457-412f-be07-626b553f6b8d-kube-api-access-zlfnq\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.726974 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-config-data\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.726999 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.727038 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-combined-ca-bundle\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " 
pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.727043 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-log-httpd\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.727072 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g54n\" (UniqueName: \"kubernetes.io/projected/127a1d07-d1f4-4c95-abf8-da08884ea57a-kube-api-access-4g54n\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.727109 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-scripts\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.727439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a523e37-804a-4173-8012-19848efc8cc0-etc-machine-id\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.728218 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-run-httpd\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.731259 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.734077 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-config-data\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.736243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.738209 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-scripts\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.741071 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-db-sync-config-data\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.741583 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-config-data\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.742188 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-combined-ca-bundle\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.750718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3158c13-7e3a-4467-85bb-acb01fcba800-kube-api-access-czs4n" (OuterVolumeSpecName: "kube-api-access-czs4n") pod "c3158c13-7e3a-4467-85bb-acb01fcba800" (UID: "c3158c13-7e3a-4467-85bb-acb01fcba800"). InnerVolumeSpecName "kube-api-access-czs4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.755640 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-scripts\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.763275 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75gmn\" (UniqueName: \"kubernetes.io/projected/9a523e37-804a-4173-8012-19848efc8cc0-kube-api-access-75gmn\") pod \"cinder-db-sync-99rss\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.782628 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g54n\" (UniqueName: \"kubernetes.io/projected/127a1d07-d1f4-4c95-abf8-da08884ea57a-kube-api-access-4g54n\") pod \"ceilometer-0\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") " pod="openstack/ceilometer-0" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.818444 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="c3158c13-7e3a-4467-85bb-acb01fcba800" containerID="ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77" exitCode=0 Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.818480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" event={"ID":"c3158c13-7e3a-4467-85bb-acb01fcba800","Type":"ContainerDied","Data":"ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77"} Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.818508 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" event={"ID":"c3158c13-7e3a-4467-85bb-acb01fcba800","Type":"ContainerDied","Data":"06e717b0d134806f8d7d505648eafd216737d8d9f715d61ef252289f83c5db6e"} Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.818524 4771 scope.go:117] "RemoveContainer" containerID="ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.818637 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wrk8t" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833180 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlfnq\" (UniqueName: \"kubernetes.io/projected/91950d77-9457-412f-be07-626b553f6b8d-kube-api-access-zlfnq\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833212 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-config-data\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833243 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-combined-ca-bundle\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833361 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-config\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc 
kubenswrapper[4771]: I1001 15:13:21.833388 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdbvj\" (UniqueName: \"kubernetes.io/projected/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-kube-api-access-pdbvj\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833412 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833430 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833493 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhkrp\" (UniqueName: \"kubernetes.io/projected/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-kube-api-access-jhkrp\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833508 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-horizon-secret-key\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833532 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833550 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-logs\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833569 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-scripts\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833603 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91950d77-9457-412f-be07-626b553f6b8d-logs\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833634 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-combined-ca-bundle\") pod \"barbican-db-sync-nvszz\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833654 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-config-data\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833695 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-db-sync-config-data\") pod \"barbican-db-sync-nvszz\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833716 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-scripts\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833746 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tg27h\" (UniqueName: \"kubernetes.io/projected/313f9ab1-8fa8-476f-94cb-1d94bd975a06-kube-api-access-tg27h\") pod \"barbican-db-sync-nvszz\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.833792 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czs4n\" (UniqueName: \"kubernetes.io/projected/c3158c13-7e3a-4467-85bb-acb01fcba800-kube-api-access-czs4n\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.835941 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91950d77-9457-412f-be07-626b553f6b8d-logs\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:21 crc 
kubenswrapper[4771]: I1001 15:13:21.836581 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-swift-storage-0\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.836869 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-logs\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.837279 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-nb\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.837340 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-scripts\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.838805 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-config\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.838837 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-config-data\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.839531 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-svc\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:21 crc kubenswrapper[4771]: I1001 15:13:21.840033 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.854574 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-db-sync-config-data\") pod \"barbican-db-sync-nvszz\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.854790 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-scripts\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.854935 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-combined-ca-bundle\") pod \"barbican-db-sync-nvszz\" (UID: 
\"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.857215 4771 scope.go:117] "RemoveContainer" containerID="d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.860894 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-horizon-secret-key\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.860971 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdbvj\" (UniqueName: \"kubernetes.io/projected/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-kube-api-access-pdbvj\") pod \"horizon-7975b9b95f-8lwcw\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.860996 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-config-data\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.864538 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhkrp\" (UniqueName: \"kubernetes.io/projected/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-kube-api-access-jhkrp\") pod \"dnsmasq-dns-76fcf4b695-qjn7w\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.869089 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg27h\" (UniqueName: 
\"kubernetes.io/projected/313f9ab1-8fa8-476f-94cb-1d94bd975a06-kube-api-access-tg27h\") pod \"barbican-db-sync-nvszz\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") " pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.873015 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlfnq\" (UniqueName: \"kubernetes.io/projected/91950d77-9457-412f-be07-626b553f6b8d-kube-api-access-zlfnq\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.890630 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c3158c13-7e3a-4467-85bb-acb01fcba800" (UID: "c3158c13-7e3a-4467-85bb-acb01fcba800"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.892203 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-config" (OuterVolumeSpecName: "config") pod "c3158c13-7e3a-4467-85bb-acb01fcba800" (UID: "c3158c13-7e3a-4467-85bb-acb01fcba800"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.903460 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c3158c13-7e3a-4467-85bb-acb01fcba800" (UID: "c3158c13-7e3a-4467-85bb-acb01fcba800"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.904202 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-combined-ca-bundle\") pod \"placement-db-sync-d4zpd\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.905087 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c3158c13-7e3a-4467-85bb-acb01fcba800" (UID: "c3158c13-7e3a-4467-85bb-acb01fcba800"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.908512 4771 scope.go:117] "RemoveContainer" containerID="ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.909177 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c3158c13-7e3a-4467-85bb-acb01fcba800" (UID: "c3158c13-7e3a-4467-85bb-acb01fcba800"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:22 crc kubenswrapper[4771]: E1001 15:13:21.910182 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77\": container with ID starting with ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77 not found: ID does not exist" containerID="ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.910224 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77"} err="failed to get container status \"ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77\": rpc error: code = NotFound desc = could not find container \"ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77\": container with ID starting with ec3c80d24487130f95d057281465248d12cfa12c294ac6271b7edc9b977bad77 not found: ID does not exist" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.910262 4771 scope.go:117] "RemoveContainer" containerID="d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb" Oct 01 15:13:22 crc kubenswrapper[4771]: E1001 15:13:21.910628 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb\": container with ID starting with d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb not found: ID does not exist" containerID="d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.910645 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb"} 
err="failed to get container status \"d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb\": rpc error: code = NotFound desc = could not find container \"d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb\": container with ID starting with d5a986d2ce17ea560ae7a61d317eda70744bada3d1bcbf933a0008de1694ebeb not found: ID does not exist" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.914678 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-99rss" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.933098 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.936895 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.936929 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.936941 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.936952 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.936962 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3158c13-7e3a-4467-85bb-acb01fcba800-config\") on 
node \"crc\" DevicePath \"\"" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.948177 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nvszz" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:21.987452 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d4zpd" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:22.018462 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:22.039870 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:22.042108 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t97db"] Oct 01 15:13:22 crc kubenswrapper[4771]: W1001 15:13:22.084719 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podce394a9f_0955_4eaf_8617_c0087216c295.slice/crio-dbba7dda23d192bfe44659c7ee196c040079bcd228604189fcd73d43e4733fe5 WatchSource:0}: Error finding container dbba7dda23d192bfe44659c7ee196c040079bcd228604189fcd73d43e4733fe5: Status 404 returned error can't find the container with id dbba7dda23d192bfe44659c7ee196c040079bcd228604189fcd73d43e4733fe5 Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:22.157421 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wrk8t"] Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:22.164050 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wrk8t"] Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:22.837457 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t97db" 
event={"ID":"ce394a9f-0955-4eaf-8617-c0087216c295","Type":"ContainerStarted","Data":"78f6f199594e13c88dab6b12df66e120e384a65068fea0849457119f74ae27fc"} Oct 01 15:13:22 crc kubenswrapper[4771]: I1001 15:13:22.837866 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t97db" event={"ID":"ce394a9f-0955-4eaf-8617-c0087216c295","Type":"ContainerStarted","Data":"dbba7dda23d192bfe44659c7ee196c040079bcd228604189fcd73d43e4733fe5"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.312078 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-t97db" podStartSLOduration=2.312059068 podStartE2EDuration="2.312059068s" podCreationTimestamp="2025-10-01 15:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:22.863096165 +0000 UTC m=+1047.482271326" watchObservedRunningTime="2025-10-01 15:13:23.312059068 +0000 UTC m=+1047.931234239" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.316176 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7ccd9d5bb7-8mdjn"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.352910 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-d4zpd"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.363539 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-99rss"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.382765 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7975b9b95f-8lwcw"] Oct 01 15:13:23 crc kubenswrapper[4771]: W1001 15:13:23.387455 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91950d77_9457_412f_be07_626b553f6b8d.slice/crio-7cb9677ed675d24cb8cd0ba643e00fd75391fcb66fa1656591799e73b061532e WatchSource:0}: Error 
finding container 7cb9677ed675d24cb8cd0ba643e00fd75391fcb66fa1656591799e73b061532e: Status 404 returned error can't find the container with id 7cb9677ed675d24cb8cd0ba643e00fd75391fcb66fa1656591799e73b061532e Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.394339 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:13:23 crc kubenswrapper[4771]: W1001 15:13:23.406861 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod127a1d07_d1f4_4c95_abf8_da08884ea57a.slice/crio-a75867f32bf3e6dab26e393500b4de93fda7ff71f88a394bd3f0c6bfa4b83619 WatchSource:0}: Error finding container a75867f32bf3e6dab26e393500b4de93fda7ff71f88a394bd3f0c6bfa4b83619: Status 404 returned error can't find the container with id a75867f32bf3e6dab26e393500b4de93fda7ff71f88a394bd3f0c6bfa4b83619 Oct 01 15:13:23 crc kubenswrapper[4771]: W1001 15:13:23.411041 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7709b00c_a5a2_46c9_a4dc_ffa7fe3c912c.slice/crio-f005d94ff6b702b1b395c577e4e295bb5f7843ba209aadf1c1c6a3b26730a72b WatchSource:0}: Error finding container f005d94ff6b702b1b395c577e4e295bb5f7843ba209aadf1c1c6a3b26730a72b: Status 404 returned error can't find the container with id f005d94ff6b702b1b395c577e4e295bb5f7843ba209aadf1c1c6a3b26730a72b Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.414191 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-nvszz"] Oct 01 15:13:23 crc kubenswrapper[4771]: W1001 15:13:23.415775 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod313f9ab1_8fa8_476f_94cb_1d94bd975a06.slice/crio-dc53eed9b5af672fac2aa9b758cb652b9ba69ac0bdd36a39b3f5466d7b845360 WatchSource:0}: Error finding container 
dc53eed9b5af672fac2aa9b758cb652b9ba69ac0bdd36a39b3f5466d7b845360: Status 404 returned error can't find the container with id dc53eed9b5af672fac2aa9b758cb652b9ba69ac0bdd36a39b3f5466d7b845360 Oct 01 15:13:23 crc kubenswrapper[4771]: W1001 15:13:23.423045 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode48032b5_450d_4975_9657_d7dd1fa95f3b.slice/crio-2c91548e2422a803d2392264a2c3f0843c5d0c6552d95404cd0d0b7120d7c890 WatchSource:0}: Error finding container 2c91548e2422a803d2392264a2c3f0843c5d0c6552d95404cd0d0b7120d7c890: Status 404 returned error can't find the container with id 2c91548e2422a803d2392264a2c3f0843c5d0c6552d95404cd0d0b7120d7c890 Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.423555 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-qfvj2"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.434877 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qjn7w"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.442693 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-mpc2n"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.690282 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7975b9b95f-8lwcw"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.730069 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-d8c64bd85-clrr4"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.733638 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.737615 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.761649 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d8c64bd85-clrr4"] Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.787176 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-config-data\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.787292 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe922d16-c5a9-4d8d-ba3e-33042e83372b-horizon-secret-key\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.787332 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe922d16-c5a9-4d8d-ba3e-33042e83372b-logs\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.787406 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhghl\" (UniqueName: \"kubernetes.io/projected/fe922d16-c5a9-4d8d-ba3e-33042e83372b-kube-api-access-qhghl\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: 
I1001 15:13:23.789048 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-scripts\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.861221 4771 generic.go:334] "Generic (PLEG): container finished" podID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" containerID="b7156af422e3c6aaf0282ef2b69573b0ade9765298ff0bfabec5bc5907220610" exitCode=0 Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.861303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" event={"ID":"ceb96a8c-ce5a-420e-aa1e-594d09fc1487","Type":"ContainerDied","Data":"b7156af422e3c6aaf0282ef2b69573b0ade9765298ff0bfabec5bc5907220610"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.861330 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" event={"ID":"ceb96a8c-ce5a-420e-aa1e-594d09fc1487","Type":"ContainerStarted","Data":"07acfd45c9803ba4f89499650d17e0b3a8fce807090693a47c1ee664dee47e74"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.868439 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975b9b95f-8lwcw" event={"ID":"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0","Type":"ContainerStarted","Data":"f28e2c258c09bbe46fa44aceba7b17ea14f490c94b0e647ad4001180b0a14c1e"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.880174 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127a1d07-d1f4-4c95-abf8-da08884ea57a","Type":"ContainerStarted","Data":"a75867f32bf3e6dab26e393500b4de93fda7ff71f88a394bd3f0c6bfa4b83619"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.883117 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ccd9d5bb7-8mdjn" 
event={"ID":"b474b0f5-0050-4edd-afac-9237aa7284a5","Type":"ContainerStarted","Data":"4fb064a0edef6b0399c5f9110bfc424ad88a901268cefcd1373c1ef812e65baf"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.891697 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-config-data\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.891830 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe922d16-c5a9-4d8d-ba3e-33042e83372b-horizon-secret-key\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.891860 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe922d16-c5a9-4d8d-ba3e-33042e83372b-logs\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.891919 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhghl\" (UniqueName: \"kubernetes.io/projected/fe922d16-c5a9-4d8d-ba3e-33042e83372b-kube-api-access-qhghl\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.891938 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-scripts\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " 
pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.892385 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-99rss" event={"ID":"9a523e37-804a-4173-8012-19848efc8cc0","Type":"ContainerStarted","Data":"0daed170c84f6be65c4e73f4b070a7cac27dccbf117041fd0b321f1dc175d1b6"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.892920 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe922d16-c5a9-4d8d-ba3e-33042e83372b-logs\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.893247 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-config-data\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.894085 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-scripts\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.900301 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qfvj2" event={"ID":"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c","Type":"ContainerStarted","Data":"758ccba9aed842f21dfca918e8574ac6b220bd4a6467406d8a7aafee26ba08a9"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.900344 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qfvj2" 
event={"ID":"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c","Type":"ContainerStarted","Data":"f005d94ff6b702b1b395c577e4e295bb5f7843ba209aadf1c1c6a3b26730a72b"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.907983 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe922d16-c5a9-4d8d-ba3e-33042e83372b-horizon-secret-key\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.912159 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nvszz" event={"ID":"313f9ab1-8fa8-476f-94cb-1d94bd975a06","Type":"ContainerStarted","Data":"dc53eed9b5af672fac2aa9b758cb652b9ba69ac0bdd36a39b3f5466d7b845360"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.918262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhghl\" (UniqueName: \"kubernetes.io/projected/fe922d16-c5a9-4d8d-ba3e-33042e83372b-kube-api-access-qhghl\") pod \"horizon-d8c64bd85-clrr4\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.918776 4771 generic.go:334] "Generic (PLEG): container finished" podID="e48032b5-450d-4975-9657-d7dd1fa95f3b" containerID="76d84142faedbad0436c1d37d99a8e5d8a6b7d3db17d6a08e81984a5f20a8bff" exitCode=0 Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.918837 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" event={"ID":"e48032b5-450d-4975-9657-d7dd1fa95f3b","Type":"ContainerDied","Data":"76d84142faedbad0436c1d37d99a8e5d8a6b7d3db17d6a08e81984a5f20a8bff"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.918865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" 
event={"ID":"e48032b5-450d-4975-9657-d7dd1fa95f3b","Type":"ContainerStarted","Data":"2c91548e2422a803d2392264a2c3f0843c5d0c6552d95404cd0d0b7120d7c890"} Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.921078 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-qfvj2" podStartSLOduration=2.921067237 podStartE2EDuration="2.921067237s" podCreationTimestamp="2025-10-01 15:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:23.918603797 +0000 UTC m=+1048.537778968" watchObservedRunningTime="2025-10-01 15:13:23.921067237 +0000 UTC m=+1048.540242408" Oct 01 15:13:23 crc kubenswrapper[4771]: I1001 15:13:23.929450 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d4zpd" event={"ID":"91950d77-9457-412f-be07-626b553f6b8d","Type":"ContainerStarted","Data":"7cb9677ed675d24cb8cd0ba643e00fd75391fcb66fa1656591799e73b061532e"} Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.049314 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3158c13-7e3a-4467-85bb-acb01fcba800" path="/var/lib/kubelet/pods/c3158c13-7e3a-4467-85bb-acb01fcba800/volumes" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.075349 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.199842 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.300574 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-nb\") pod \"e48032b5-450d-4975-9657-d7dd1fa95f3b\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.301010 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-swift-storage-0\") pod \"e48032b5-450d-4975-9657-d7dd1fa95f3b\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.301259 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-svc\") pod \"e48032b5-450d-4975-9657-d7dd1fa95f3b\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.301302 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-config\") pod \"e48032b5-450d-4975-9657-d7dd1fa95f3b\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.301378 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-sb\") pod \"e48032b5-450d-4975-9657-d7dd1fa95f3b\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.301423 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdn5c\" 
(UniqueName: \"kubernetes.io/projected/e48032b5-450d-4975-9657-d7dd1fa95f3b-kube-api-access-mdn5c\") pod \"e48032b5-450d-4975-9657-d7dd1fa95f3b\" (UID: \"e48032b5-450d-4975-9657-d7dd1fa95f3b\") " Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.305831 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e48032b5-450d-4975-9657-d7dd1fa95f3b-kube-api-access-mdn5c" (OuterVolumeSpecName: "kube-api-access-mdn5c") pod "e48032b5-450d-4975-9657-d7dd1fa95f3b" (UID: "e48032b5-450d-4975-9657-d7dd1fa95f3b"). InnerVolumeSpecName "kube-api-access-mdn5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.322883 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e48032b5-450d-4975-9657-d7dd1fa95f3b" (UID: "e48032b5-450d-4975-9657-d7dd1fa95f3b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.330308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e48032b5-450d-4975-9657-d7dd1fa95f3b" (UID: "e48032b5-450d-4975-9657-d7dd1fa95f3b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.338347 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-config" (OuterVolumeSpecName: "config") pod "e48032b5-450d-4975-9657-d7dd1fa95f3b" (UID: "e48032b5-450d-4975-9657-d7dd1fa95f3b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.347298 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e48032b5-450d-4975-9657-d7dd1fa95f3b" (UID: "e48032b5-450d-4975-9657-d7dd1fa95f3b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.350575 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e48032b5-450d-4975-9657-d7dd1fa95f3b" (UID: "e48032b5-450d-4975-9657-d7dd1fa95f3b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.403789 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.403830 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.403842 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.403853 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:24 crc 
kubenswrapper[4771]: I1001 15:13:24.403862 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e48032b5-450d-4975-9657-d7dd1fa95f3b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.403870 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdn5c\" (UniqueName: \"kubernetes.io/projected/e48032b5-450d-4975-9657-d7dd1fa95f3b-kube-api-access-mdn5c\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.562866 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-d8c64bd85-clrr4"] Oct 01 15:13:24 crc kubenswrapper[4771]: W1001 15:13:24.625506 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe922d16_c5a9_4d8d_ba3e_33042e83372b.slice/crio-bfbd67eec3cfad1f839156a32e32f0619a78278bc33157788236f20470e12e1a WatchSource:0}: Error finding container bfbd67eec3cfad1f839156a32e32f0619a78278bc33157788236f20470e12e1a: Status 404 returned error can't find the container with id bfbd67eec3cfad1f839156a32e32f0619a78278bc33157788236f20470e12e1a Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.947992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d8c64bd85-clrr4" event={"ID":"fe922d16-c5a9-4d8d-ba3e-33042e83372b","Type":"ContainerStarted","Data":"bfbd67eec3cfad1f839156a32e32f0619a78278bc33157788236f20470e12e1a"} Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.961260 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" event={"ID":"ceb96a8c-ce5a-420e-aa1e-594d09fc1487","Type":"ContainerStarted","Data":"f749f84155fd265a1e036a5648224fc00c532993dd4d7c8e0cd2bc56722c553a"} Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.961410 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.968603 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.968688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55fff446b9-mpc2n" event={"ID":"e48032b5-450d-4975-9657-d7dd1fa95f3b","Type":"ContainerDied","Data":"2c91548e2422a803d2392264a2c3f0843c5d0c6552d95404cd0d0b7120d7c890"} Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.968743 4771 scope.go:117] "RemoveContainer" containerID="76d84142faedbad0436c1d37d99a8e5d8a6b7d3db17d6a08e81984a5f20a8bff" Oct 01 15:13:24 crc kubenswrapper[4771]: I1001 15:13:24.984808 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" podStartSLOduration=3.984791313 podStartE2EDuration="3.984791313s" podCreationTimestamp="2025-10-01 15:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:24.981879461 +0000 UTC m=+1049.601054632" watchObservedRunningTime="2025-10-01 15:13:24.984791313 +0000 UTC m=+1049.603966484" Oct 01 15:13:25 crc kubenswrapper[4771]: I1001 15:13:25.023772 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-mpc2n"] Oct 01 15:13:25 crc kubenswrapper[4771]: I1001 15:13:25.032090 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55fff446b9-mpc2n"] Oct 01 15:13:25 crc kubenswrapper[4771]: I1001 15:13:25.994750 4771 generic.go:334] "Generic (PLEG): container finished" podID="ce394a9f-0955-4eaf-8617-c0087216c295" containerID="78f6f199594e13c88dab6b12df66e120e384a65068fea0849457119f74ae27fc" exitCode=0 Oct 01 15:13:26 crc kubenswrapper[4771]: I1001 15:13:26.002339 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="e48032b5-450d-4975-9657-d7dd1fa95f3b" path="/var/lib/kubelet/pods/e48032b5-450d-4975-9657-d7dd1fa95f3b/volumes" Oct 01 15:13:26 crc kubenswrapper[4771]: I1001 15:13:26.002830 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t97db" event={"ID":"ce394a9f-0955-4eaf-8617-c0087216c295","Type":"ContainerDied","Data":"78f6f199594e13c88dab6b12df66e120e384a65068fea0849457119f74ae27fc"} Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.365207 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.467008 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-config-data\") pod \"ce394a9f-0955-4eaf-8617-c0087216c295\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.467063 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62q6f\" (UniqueName: \"kubernetes.io/projected/ce394a9f-0955-4eaf-8617-c0087216c295-kube-api-access-62q6f\") pod \"ce394a9f-0955-4eaf-8617-c0087216c295\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.467102 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-combined-ca-bundle\") pod \"ce394a9f-0955-4eaf-8617-c0087216c295\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.467124 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-scripts\") pod \"ce394a9f-0955-4eaf-8617-c0087216c295\" (UID: 
\"ce394a9f-0955-4eaf-8617-c0087216c295\") " Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.467182 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-credential-keys\") pod \"ce394a9f-0955-4eaf-8617-c0087216c295\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.468410 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-fernet-keys\") pod \"ce394a9f-0955-4eaf-8617-c0087216c295\" (UID: \"ce394a9f-0955-4eaf-8617-c0087216c295\") " Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.473765 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce394a9f-0955-4eaf-8617-c0087216c295-kube-api-access-62q6f" (OuterVolumeSpecName: "kube-api-access-62q6f") pod "ce394a9f-0955-4eaf-8617-c0087216c295" (UID: "ce394a9f-0955-4eaf-8617-c0087216c295"). InnerVolumeSpecName "kube-api-access-62q6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.475482 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ce394a9f-0955-4eaf-8617-c0087216c295" (UID: "ce394a9f-0955-4eaf-8617-c0087216c295"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.487198 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-scripts" (OuterVolumeSpecName: "scripts") pod "ce394a9f-0955-4eaf-8617-c0087216c295" (UID: "ce394a9f-0955-4eaf-8617-c0087216c295"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.487239 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ce394a9f-0955-4eaf-8617-c0087216c295" (UID: "ce394a9f-0955-4eaf-8617-c0087216c295"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.495312 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce394a9f-0955-4eaf-8617-c0087216c295" (UID: "ce394a9f-0955-4eaf-8617-c0087216c295"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.496301 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-config-data" (OuterVolumeSpecName: "config-data") pod "ce394a9f-0955-4eaf-8617-c0087216c295" (UID: "ce394a9f-0955-4eaf-8617-c0087216c295"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.570659 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.570695 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.570709 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62q6f\" (UniqueName: \"kubernetes.io/projected/ce394a9f-0955-4eaf-8617-c0087216c295-kube-api-access-62q6f\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.570722 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.570749 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:27 crc kubenswrapper[4771]: I1001 15:13:27.570759 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ce394a9f-0955-4eaf-8617-c0087216c295-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.027444 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t97db" event={"ID":"ce394a9f-0955-4eaf-8617-c0087216c295","Type":"ContainerDied","Data":"dbba7dda23d192bfe44659c7ee196c040079bcd228604189fcd73d43e4733fe5"} Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 
15:13:28.027486 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbba7dda23d192bfe44659c7ee196c040079bcd228604189fcd73d43e4733fe5" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.027571 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t97db" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.310240 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t97db"] Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.317380 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-t97db"] Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.406250 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-r9ffd"] Oct 01 15:13:28 crc kubenswrapper[4771]: E1001 15:13:28.406805 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e48032b5-450d-4975-9657-d7dd1fa95f3b" containerName="init" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.407092 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e48032b5-450d-4975-9657-d7dd1fa95f3b" containerName="init" Oct 01 15:13:28 crc kubenswrapper[4771]: E1001 15:13:28.407132 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce394a9f-0955-4eaf-8617-c0087216c295" containerName="keystone-bootstrap" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.407141 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce394a9f-0955-4eaf-8617-c0087216c295" containerName="keystone-bootstrap" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.407386 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e48032b5-450d-4975-9657-d7dd1fa95f3b" containerName="init" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.407412 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce394a9f-0955-4eaf-8617-c0087216c295" 
containerName="keystone-bootstrap" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.408295 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.413023 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.413932 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.419148 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.420130 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-h6k26" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.443299 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r9ffd"] Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.486676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-fernet-keys\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.486744 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p27m\" (UniqueName: \"kubernetes.io/projected/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-kube-api-access-9p27m\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.486774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-credential-keys\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.486882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-config-data\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.486947 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-scripts\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.487035 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-combined-ca-bundle\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.601612 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p27m\" (UniqueName: \"kubernetes.io/projected/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-kube-api-access-9p27m\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.601689 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-credential-keys\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.601802 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-config-data\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.601835 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-scripts\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.601902 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-combined-ca-bundle\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.601955 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-fernet-keys\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.606595 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-config-data\") pod \"keystone-bootstrap-r9ffd\" (UID: 
\"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.606592 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-fernet-keys\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.611976 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-scripts\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.622155 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-combined-ca-bundle\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.622706 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-credential-keys\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 15:13:28.652558 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p27m\" (UniqueName: \"kubernetes.io/projected/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-kube-api-access-9p27m\") pod \"keystone-bootstrap-r9ffd\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:28 crc kubenswrapper[4771]: I1001 
15:13:28.739539 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:13:29 crc kubenswrapper[4771]: I1001 15:13:29.999263 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce394a9f-0955-4eaf-8617-c0087216c295" path="/var/lib/kubelet/pods/ce394a9f-0955-4eaf-8617-c0087216c295/volumes" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.464591 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7ccd9d5bb7-8mdjn"] Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.492235 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-b48cbbc84-ndtts"] Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.494175 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.498320 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.512469 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b48cbbc84-ndtts"] Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.570485 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-d8c64bd85-clrr4"] Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.597352 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-66679756f6-g56hw"] Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.601072 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.617633 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66679756f6-g56hw"] Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.633043 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5159749-76e2-4e89-8a81-59d8ea1ab063-logs\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.633106 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-scripts\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.633143 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7trtk\" (UniqueName: \"kubernetes.io/projected/a5159749-76e2-4e89-8a81-59d8ea1ab063-kube-api-access-7trtk\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.633199 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-tls-certs\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.633219 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-secret-key\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.633243 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-combined-ca-bundle\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.633279 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-config-data\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.734569 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-scripts\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.734625 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/405e12dd-6888-4994-ac26-b2836ad9069c-scripts\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.734708 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-horizon-tls-certs\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.734771 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-combined-ca-bundle\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.734793 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7trtk\" (UniqueName: \"kubernetes.io/projected/a5159749-76e2-4e89-8a81-59d8ea1ab063-kube-api-access-7trtk\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.734969 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-horizon-secret-key\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735131 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-tls-certs\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735203 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-secret-key\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735288 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-combined-ca-bundle\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735334 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405e12dd-6888-4994-ac26-b2836ad9069c-logs\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735362 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/405e12dd-6888-4994-ac26-b2836ad9069c-config-data\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735400 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-config-data\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735532 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pzzh\" (UniqueName: 
\"kubernetes.io/projected/405e12dd-6888-4994-ac26-b2836ad9069c-kube-api-access-8pzzh\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735623 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5159749-76e2-4e89-8a81-59d8ea1ab063-logs\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.736122 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5159749-76e2-4e89-8a81-59d8ea1ab063-logs\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.735529 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-scripts\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.738804 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-config-data\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.743230 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-secret-key\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " 
pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.743270 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-combined-ca-bundle\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.745033 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-tls-certs\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.758786 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7trtk\" (UniqueName: \"kubernetes.io/projected/a5159749-76e2-4e89-8a81-59d8ea1ab063-kube-api-access-7trtk\") pod \"horizon-b48cbbc84-ndtts\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.837480 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/405e12dd-6888-4994-ac26-b2836ad9069c-scripts\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.837781 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-horizon-tls-certs\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.837809 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-combined-ca-bundle\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.837854 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-horizon-secret-key\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.838135 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405e12dd-6888-4994-ac26-b2836ad9069c-logs\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.838206 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/405e12dd-6888-4994-ac26-b2836ad9069c-config-data\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.838356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pzzh\" (UniqueName: \"kubernetes.io/projected/405e12dd-6888-4994-ac26-b2836ad9069c-kube-api-access-8pzzh\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.838454 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/405e12dd-6888-4994-ac26-b2836ad9069c-scripts\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.838665 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/405e12dd-6888-4994-ac26-b2836ad9069c-logs\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.839291 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/405e12dd-6888-4994-ac26-b2836ad9069c-config-data\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.841416 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-horizon-secret-key\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.841770 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-combined-ca-bundle\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.842020 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.843583 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/405e12dd-6888-4994-ac26-b2836ad9069c-horizon-tls-certs\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.857287 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pzzh\" (UniqueName: \"kubernetes.io/projected/405e12dd-6888-4994-ac26-b2836ad9069c-kube-api-access-8pzzh\") pod \"horizon-66679756f6-g56hw\" (UID: \"405e12dd-6888-4994-ac26-b2836ad9069c\") " pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:30 crc kubenswrapper[4771]: I1001 15:13:30.921804 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:13:32 crc kubenswrapper[4771]: I1001 15:13:32.020147 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:13:32 crc kubenswrapper[4771]: I1001 15:13:32.089823 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4tm2t"] Oct 01 15:13:32 crc kubenswrapper[4771]: I1001 15:13:32.090082 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-4tm2t" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" containerID="cri-o://28bfc8f04d9096eedd6aed5ba39ff9d3b683674cb90a41f4b2e783c56d01aa3b" gracePeriod=10 Oct 01 15:13:33 crc kubenswrapper[4771]: I1001 15:13:33.084830 4771 generic.go:334] "Generic (PLEG): container finished" podID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerID="28bfc8f04d9096eedd6aed5ba39ff9d3b683674cb90a41f4b2e783c56d01aa3b" exitCode=0 Oct 01 15:13:33 crc 
kubenswrapper[4771]: I1001 15:13:33.084918 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4tm2t" event={"ID":"a66f8282-966c-40de-9bfd-b6b75d5f519c","Type":"ContainerDied","Data":"28bfc8f04d9096eedd6aed5ba39ff9d3b683674cb90a41f4b2e783c56d01aa3b"} Oct 01 15:13:33 crc kubenswrapper[4771]: I1001 15:13:33.640929 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4tm2t" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: connect: connection refused" Oct 01 15:13:43 crc kubenswrapper[4771]: I1001 15:13:43.641481 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4tm2t" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 01 15:13:48 crc kubenswrapper[4771]: I1001 15:13:48.643146 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4tm2t" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 01 15:13:48 crc kubenswrapper[4771]: I1001 15:13:48.644032 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:13:49 crc kubenswrapper[4771]: E1001 15:13:49.099097 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Oct 01 15:13:49 crc kubenswrapper[4771]: E1001 15:13:49.099279 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6ncw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-gq7gl_openstack(7b7689a2-6ac8-47ac-86f7-7456994c39ca): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Oct 01 15:13:49 crc kubenswrapper[4771]: E1001 15:13:49.100357 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-gq7gl" podUID="7b7689a2-6ac8-47ac-86f7-7456994c39ca" Oct 01 15:13:49 crc kubenswrapper[4771]: E1001 15:13:49.215492 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-gq7gl" podUID="7b7689a2-6ac8-47ac-86f7-7456994c39ca" Oct 01 15:13:52 crc kubenswrapper[4771]: E1001 15:13:52.182418 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api:current-podified" Oct 01 15:13:52 crc kubenswrapper[4771]: E1001 15:13:52.183246 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zlfnq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-d4zpd_openstack(91950d77-9457-412f-be07-626b553f6b8d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:13:52 crc kubenswrapper[4771]: E1001 15:13:52.184679 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-d4zpd" podUID="91950d77-9457-412f-be07-626b553f6b8d" Oct 01 15:13:52 crc kubenswrapper[4771]: E1001 15:13:52.241282 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api:current-podified\\\"\"" pod="openstack/placement-db-sync-d4zpd" podUID="91950d77-9457-412f-be07-626b553f6b8d" Oct 01 15:13:53 crc kubenswrapper[4771]: I1001 15:13:53.644171 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4tm2t" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 01 15:13:53 crc kubenswrapper[4771]: I1001 15:13:53.881270 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:13:53 crc kubenswrapper[4771]: I1001 15:13:53.984261 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldhc\" (UniqueName: \"kubernetes.io/projected/a66f8282-966c-40de-9bfd-b6b75d5f519c-kube-api-access-qldhc\") pod \"a66f8282-966c-40de-9bfd-b6b75d5f519c\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " Oct 01 15:13:53 crc kubenswrapper[4771]: I1001 15:13:53.984365 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-nb\") pod \"a66f8282-966c-40de-9bfd-b6b75d5f519c\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " Oct 01 15:13:53 crc kubenswrapper[4771]: I1001 15:13:53.984438 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-sb\") pod \"a66f8282-966c-40de-9bfd-b6b75d5f519c\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " Oct 01 15:13:53 crc kubenswrapper[4771]: I1001 15:13:53.984483 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-dns-svc\") pod \"a66f8282-966c-40de-9bfd-b6b75d5f519c\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " Oct 01 15:13:53 crc kubenswrapper[4771]: I1001 15:13:53.984838 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-config\") pod \"a66f8282-966c-40de-9bfd-b6b75d5f519c\" (UID: \"a66f8282-966c-40de-9bfd-b6b75d5f519c\") " Oct 01 15:13:53 crc kubenswrapper[4771]: I1001 15:13:53.995991 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/a66f8282-966c-40de-9bfd-b6b75d5f519c-kube-api-access-qldhc" (OuterVolumeSpecName: "kube-api-access-qldhc") pod "a66f8282-966c-40de-9bfd-b6b75d5f519c" (UID: "a66f8282-966c-40de-9bfd-b6b75d5f519c"). InnerVolumeSpecName "kube-api-access-qldhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.027337 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-config" (OuterVolumeSpecName: "config") pod "a66f8282-966c-40de-9bfd-b6b75d5f519c" (UID: "a66f8282-966c-40de-9bfd-b6b75d5f519c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.031782 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a66f8282-966c-40de-9bfd-b6b75d5f519c" (UID: "a66f8282-966c-40de-9bfd-b6b75d5f519c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.032705 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a66f8282-966c-40de-9bfd-b6b75d5f519c" (UID: "a66f8282-966c-40de-9bfd-b6b75d5f519c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.049088 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a66f8282-966c-40de-9bfd-b6b75d5f519c" (UID: "a66f8282-966c-40de-9bfd-b6b75d5f519c"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.087385 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.087423 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.087434 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.087444 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66f8282-966c-40de-9bfd-b6b75d5f519c-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.087455 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qldhc\" (UniqueName: \"kubernetes.io/projected/a66f8282-966c-40de-9bfd-b6b75d5f519c-kube-api-access-qldhc\") on node \"crc\" DevicePath \"\"" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.256973 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-4tm2t" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.256843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-4tm2t" event={"ID":"a66f8282-966c-40de-9bfd-b6b75d5f519c","Type":"ContainerDied","Data":"148b87f32c66d63c41fa765c5c9ec6366ed06e55a0e8a472fac7399e98760609"} Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.257845 4771 scope.go:117] "RemoveContainer" containerID="28bfc8f04d9096eedd6aed5ba39ff9d3b683674cb90a41f4b2e783c56d01aa3b" Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.294884 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4tm2t"] Oct 01 15:13:54 crc kubenswrapper[4771]: I1001 15:13:54.305061 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-4tm2t"] Oct 01 15:13:55 crc kubenswrapper[4771]: E1001 15:13:55.222569 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Oct 01 15:13:55 crc kubenswrapper[4771]: E1001 15:13:55.223368 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-75gmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-99rss_openstack(9a523e37-804a-4173-8012-19848efc8cc0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:13:55 crc kubenswrapper[4771]: E1001 15:13:55.224910 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-99rss" podUID="9a523e37-804a-4173-8012-19848efc8cc0" Oct 01 15:13:55 crc kubenswrapper[4771]: E1001 15:13:55.272606 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-99rss" podUID="9a523e37-804a-4173-8012-19848efc8cc0" Oct 01 15:13:55 crc kubenswrapper[4771]: E1001 15:13:55.781205 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Oct 01 15:13:55 crc kubenswrapper[4771]: E1001 15:13:55.781364 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tg27h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-nvszz_openstack(313f9ab1-8fa8-476f-94cb-1d94bd975a06): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:13:55 crc kubenswrapper[4771]: E1001 15:13:55.782660 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-nvszz" 
podUID="313f9ab1-8fa8-476f-94cb-1d94bd975a06" Oct 01 15:13:55 crc kubenswrapper[4771]: I1001 15:13:55.998379 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" path="/var/lib/kubelet/pods/a66f8282-966c-40de-9bfd-b6b75d5f519c/volumes" Oct 01 15:13:56 crc kubenswrapper[4771]: E1001 15:13:56.146428 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Oct 01 15:13:56 crc kubenswrapper[4771]: E1001 15:13:56.146663 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nbch5c7h549h699h5c9h568h5c5h9h8bh5fh5b4h5d7h64h59dh4h58fh8dh5ffh58dhbfh57bh544h684h677h685h6ch579h667h586hf7h585hbfq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:
nil,},VolumeMount{Name:kube-api-access-4g54n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(127a1d07-d1f4-4c95-abf8-da08884ea57a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:13:56 crc kubenswrapper[4771]: I1001 15:13:56.165521 4771 scope.go:117] "RemoveContainer" containerID="00b63e9bb2fedec4a96328107b84b5fcd377240fd28be4b85603755ca670a80c" Oct 01 15:13:56 crc kubenswrapper[4771]: E1001 15:13:56.281954 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-nvszz" podUID="313f9ab1-8fa8-476f-94cb-1d94bd975a06" Oct 01 15:13:56 crc kubenswrapper[4771]: I1001 15:13:56.599666 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-r9ffd"] 
Oct 01 15:13:56 crc kubenswrapper[4771]: W1001 15:13:56.608207 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda10ee2ec_1ec2_4353_ad8b_74ac0e031289.slice/crio-ee70ca3053e8ea9f66137925638401f44d84436c91f57bb12b76bf97773a51de WatchSource:0}: Error finding container ee70ca3053e8ea9f66137925638401f44d84436c91f57bb12b76bf97773a51de: Status 404 returned error can't find the container with id ee70ca3053e8ea9f66137925638401f44d84436c91f57bb12b76bf97773a51de Oct 01 15:13:56 crc kubenswrapper[4771]: I1001 15:13:56.661773 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-b48cbbc84-ndtts"] Oct 01 15:13:56 crc kubenswrapper[4771]: I1001 15:13:56.722017 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-66679756f6-g56hw"] Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.295649 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d8c64bd85-clrr4" event={"ID":"fe922d16-c5a9-4d8d-ba3e-33042e83372b","Type":"ContainerStarted","Data":"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.295699 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d8c64bd85-clrr4" event={"ID":"fe922d16-c5a9-4d8d-ba3e-33042e83372b","Type":"ContainerStarted","Data":"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.295703 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-d8c64bd85-clrr4" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerName="horizon-log" containerID="cri-o://04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5" gracePeriod=30 Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.295845 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/horizon-d8c64bd85-clrr4" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerName="horizon" containerID="cri-o://5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529" gracePeriod=30 Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.300597 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ccd9d5bb7-8mdjn" event={"ID":"b474b0f5-0050-4edd-afac-9237aa7284a5","Type":"ContainerStarted","Data":"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.300642 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ccd9d5bb7-8mdjn" event={"ID":"b474b0f5-0050-4edd-afac-9237aa7284a5","Type":"ContainerStarted","Data":"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.300893 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7ccd9d5bb7-8mdjn" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerName="horizon-log" containerID="cri-o://2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712" gracePeriod=30 Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.301068 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7ccd9d5bb7-8mdjn" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerName="horizon" containerID="cri-o://1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31" gracePeriod=30 Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.302004 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b48cbbc84-ndtts" event={"ID":"a5159749-76e2-4e89-8a81-59d8ea1ab063","Type":"ContainerStarted","Data":"a90b6e35d0b25d1bee2280f135b1a7b7fbd994bb638560538cf71e22bdc7f88b"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.302030 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/horizon-b48cbbc84-ndtts" event={"ID":"a5159749-76e2-4e89-8a81-59d8ea1ab063","Type":"ContainerStarted","Data":"c06305591c15a87728b3c19dea7ad2aaec3af99f9c8d51f5c79833a931704378"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.303813 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66679756f6-g56hw" event={"ID":"405e12dd-6888-4994-ac26-b2836ad9069c","Type":"ContainerStarted","Data":"ed304c6ee42aac0789277ee0c25cfa7b29725860cd7c86d35b0959bf03b47633"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.303851 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-66679756f6-g56hw" event={"ID":"405e12dd-6888-4994-ac26-b2836ad9069c","Type":"ContainerStarted","Data":"e889caad062265be0c395b63f93ad3ddf4aea9ec81e897e2032e8a7a67161231"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.306398 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975b9b95f-8lwcw" event={"ID":"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0","Type":"ContainerStarted","Data":"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.306432 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975b9b95f-8lwcw" event={"ID":"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0","Type":"ContainerStarted","Data":"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.306430 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7975b9b95f-8lwcw" podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerName="horizon-log" containerID="cri-o://916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41" gracePeriod=30 Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.306482 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-7975b9b95f-8lwcw" 
podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerName="horizon" containerID="cri-o://b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0" gracePeriod=30 Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.309558 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r9ffd" event={"ID":"a10ee2ec-1ec2-4353-ad8b-74ac0e031289","Type":"ContainerStarted","Data":"24164528286d94c164609d173dbf72e452c655f857e1f9c2834ee0be39e4d572"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.309589 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r9ffd" event={"ID":"a10ee2ec-1ec2-4353-ad8b-74ac0e031289","Type":"ContainerStarted","Data":"ee70ca3053e8ea9f66137925638401f44d84436c91f57bb12b76bf97773a51de"} Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.333295 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-d8c64bd85-clrr4" podStartSLOduration=2.7858490270000003 podStartE2EDuration="34.333274947s" podCreationTimestamp="2025-10-01 15:13:23 +0000 UTC" firstStartedPulling="2025-10-01 15:13:24.627675938 +0000 UTC m=+1049.246851109" lastFinishedPulling="2025-10-01 15:13:56.175101858 +0000 UTC m=+1080.794277029" observedRunningTime="2025-10-01 15:13:57.319321123 +0000 UTC m=+1081.938496294" watchObservedRunningTime="2025-10-01 15:13:57.333274947 +0000 UTC m=+1081.952450118" Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.352982 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7975b9b95f-8lwcw" podStartSLOduration=3.6224723709999997 podStartE2EDuration="36.352961814s" podCreationTimestamp="2025-10-01 15:13:21 +0000 UTC" firstStartedPulling="2025-10-01 15:13:23.407900907 +0000 UTC m=+1048.027076078" lastFinishedPulling="2025-10-01 15:13:56.13839035 +0000 UTC m=+1080.757565521" observedRunningTime="2025-10-01 15:13:57.348992236 +0000 UTC m=+1081.968167417" 
watchObservedRunningTime="2025-10-01 15:13:57.352961814 +0000 UTC m=+1081.972136985" Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.374941 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-7ccd9d5bb7-8mdjn" podStartSLOduration=3.528436526 podStartE2EDuration="36.374920997s" podCreationTimestamp="2025-10-01 15:13:21 +0000 UTC" firstStartedPulling="2025-10-01 15:13:23.341352392 +0000 UTC m=+1047.960527563" lastFinishedPulling="2025-10-01 15:13:56.187836863 +0000 UTC m=+1080.807012034" observedRunningTime="2025-10-01 15:13:57.365656777 +0000 UTC m=+1081.984831948" watchObservedRunningTime="2025-10-01 15:13:57.374920997 +0000 UTC m=+1081.994096168" Oct 01 15:13:57 crc kubenswrapper[4771]: I1001 15:13:57.392125 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-r9ffd" podStartSLOduration=29.392101231 podStartE2EDuration="29.392101231s" podCreationTimestamp="2025-10-01 15:13:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:57.387945198 +0000 UTC m=+1082.007120379" watchObservedRunningTime="2025-10-01 15:13:57.392101231 +0000 UTC m=+1082.011276402" Oct 01 15:13:58 crc kubenswrapper[4771]: I1001 15:13:58.322295 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127a1d07-d1f4-4c95-abf8-da08884ea57a","Type":"ContainerStarted","Data":"67d4f49b7771a6cd394edbf1f7a2ab2572e824e9ea022772d3e883951c3927c0"} Oct 01 15:13:58 crc kubenswrapper[4771]: I1001 15:13:58.327039 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b48cbbc84-ndtts" event={"ID":"a5159749-76e2-4e89-8a81-59d8ea1ab063","Type":"ContainerStarted","Data":"2d01502bbf6fc58e63063d52d24bfa7a5cf14caa1fd2964932d63d8a121a9349"} Oct 01 15:13:58 crc kubenswrapper[4771]: I1001 15:13:58.333998 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/horizon-66679756f6-g56hw" event={"ID":"405e12dd-6888-4994-ac26-b2836ad9069c","Type":"ContainerStarted","Data":"c07d4741182794290063a44c67858beb5c87d2b46bd14c9b5bde4b2a2d824458"} Oct 01 15:13:58 crc kubenswrapper[4771]: I1001 15:13:58.350489 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-b48cbbc84-ndtts" podStartSLOduration=28.350467502 podStartE2EDuration="28.350467502s" podCreationTimestamp="2025-10-01 15:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:58.349760045 +0000 UTC m=+1082.968935236" watchObservedRunningTime="2025-10-01 15:13:58.350467502 +0000 UTC m=+1082.969642683" Oct 01 15:13:58 crc kubenswrapper[4771]: I1001 15:13:58.377803 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-66679756f6-g56hw" podStartSLOduration=28.377778838 podStartE2EDuration="28.377778838s" podCreationTimestamp="2025-10-01 15:13:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:13:58.366193082 +0000 UTC m=+1082.985368273" watchObservedRunningTime="2025-10-01 15:13:58.377778838 +0000 UTC m=+1082.996954019" Oct 01 15:13:58 crc kubenswrapper[4771]: I1001 15:13:58.644845 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-4tm2t" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.115:5353: i/o timeout" Oct 01 15:14:00 crc kubenswrapper[4771]: I1001 15:14:00.842486 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:14:00 crc kubenswrapper[4771]: I1001 15:14:00.843033 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-b48cbbc84-ndtts" Oct 
01 15:14:00 crc kubenswrapper[4771]: I1001 15:14:00.922821 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:14:00 crc kubenswrapper[4771]: I1001 15:14:00.922871 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:14:01 crc kubenswrapper[4771]: I1001 15:14:01.641258 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:14:02 crc kubenswrapper[4771]: I1001 15:14:02.040996 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:14:02 crc kubenswrapper[4771]: I1001 15:14:02.371217 4771 generic.go:334] "Generic (PLEG): container finished" podID="a10ee2ec-1ec2-4353-ad8b-74ac0e031289" containerID="24164528286d94c164609d173dbf72e452c655f857e1f9c2834ee0be39e4d572" exitCode=0 Oct 01 15:14:02 crc kubenswrapper[4771]: I1001 15:14:02.371280 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r9ffd" event={"ID":"a10ee2ec-1ec2-4353-ad8b-74ac0e031289","Type":"ContainerDied","Data":"24164528286d94c164609d173dbf72e452c655f857e1f9c2834ee0be39e4d572"} Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.076923 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.219266 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.389492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-r9ffd" event={"ID":"a10ee2ec-1ec2-4353-ad8b-74ac0e031289","Type":"ContainerDied","Data":"ee70ca3053e8ea9f66137925638401f44d84436c91f57bb12b76bf97773a51de"} Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.389539 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee70ca3053e8ea9f66137925638401f44d84436c91f57bb12b76bf97773a51de" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.389576 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-r9ffd" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.409105 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-scripts\") pod \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.409232 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-combined-ca-bundle\") pod \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.409268 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-credential-keys\") pod \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.409306 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9p27m\" 
(UniqueName: \"kubernetes.io/projected/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-kube-api-access-9p27m\") pod \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.409331 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-fernet-keys\") pod \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.409416 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-config-data\") pod \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\" (UID: \"a10ee2ec-1ec2-4353-ad8b-74ac0e031289\") " Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.417933 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-kube-api-access-9p27m" (OuterVolumeSpecName: "kube-api-access-9p27m") pod "a10ee2ec-1ec2-4353-ad8b-74ac0e031289" (UID: "a10ee2ec-1ec2-4353-ad8b-74ac0e031289"). InnerVolumeSpecName "kube-api-access-9p27m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.439588 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-scripts" (OuterVolumeSpecName: "scripts") pod "a10ee2ec-1ec2-4353-ad8b-74ac0e031289" (UID: "a10ee2ec-1ec2-4353-ad8b-74ac0e031289"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.441359 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a10ee2ec-1ec2-4353-ad8b-74ac0e031289" (UID: "a10ee2ec-1ec2-4353-ad8b-74ac0e031289"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.450928 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a10ee2ec-1ec2-4353-ad8b-74ac0e031289" (UID: "a10ee2ec-1ec2-4353-ad8b-74ac0e031289"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.476896 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-config-data" (OuterVolumeSpecName: "config-data") pod "a10ee2ec-1ec2-4353-ad8b-74ac0e031289" (UID: "a10ee2ec-1ec2-4353-ad8b-74ac0e031289"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.512163 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.512203 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9p27m\" (UniqueName: \"kubernetes.io/projected/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-kube-api-access-9p27m\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.512220 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.512231 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.512242 4771 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.518470 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a10ee2ec-1ec2-4353-ad8b-74ac0e031289" (UID: "a10ee2ec-1ec2-4353-ad8b-74ac0e031289"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.568817 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-74b8bdcb7c-xttgq"] Oct 01 15:14:04 crc kubenswrapper[4771]: E1001 15:14:04.569425 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="init" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.569449 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="init" Oct 01 15:14:04 crc kubenswrapper[4771]: E1001 15:14:04.569493 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.569501 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" Oct 01 15:14:04 crc kubenswrapper[4771]: E1001 15:14:04.569523 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10ee2ec-1ec2-4353-ad8b-74ac0e031289" containerName="keystone-bootstrap" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.569531 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10ee2ec-1ec2-4353-ad8b-74ac0e031289" containerName="keystone-bootstrap" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.569810 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a66f8282-966c-40de-9bfd-b6b75d5f519c" containerName="dnsmasq-dns" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.569843 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10ee2ec-1ec2-4353-ad8b-74ac0e031289" containerName="keystone-bootstrap" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.570530 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.574846 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.575093 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.588825 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-74b8bdcb7c-xttgq"] Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.613427 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a10ee2ec-1ec2-4353-ad8b-74ac0e031289-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.714541 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-public-tls-certs\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.714946 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-credential-keys\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.714996 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-internal-tls-certs\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: 
\"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.715043 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-config-data\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.715087 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-combined-ca-bundle\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.715134 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztkpb\" (UniqueName: \"kubernetes.io/projected/5e356f03-9445-4825-a39d-8b564bd4ea1c-kube-api-access-ztkpb\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.715178 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-fernet-keys\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.715277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-scripts\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: 
\"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.817219 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-scripts\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.817274 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-public-tls-certs\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.817320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-credential-keys\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.817340 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-internal-tls-certs\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.817358 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-config-data\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc 
kubenswrapper[4771]: I1001 15:14:04.817384 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-combined-ca-bundle\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.817408 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztkpb\" (UniqueName: \"kubernetes.io/projected/5e356f03-9445-4825-a39d-8b564bd4ea1c-kube-api-access-ztkpb\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.817431 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-fernet-keys\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.822043 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-public-tls-certs\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.825491 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-config-data\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.829753 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-internal-tls-certs\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.829964 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-credential-keys\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.830303 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-fernet-keys\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.837680 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-combined-ca-bundle\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.839151 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5e356f03-9445-4825-a39d-8b564bd4ea1c-scripts\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.844279 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztkpb\" (UniqueName: 
\"kubernetes.io/projected/5e356f03-9445-4825-a39d-8b564bd4ea1c-kube-api-access-ztkpb\") pod \"keystone-74b8bdcb7c-xttgq\" (UID: \"5e356f03-9445-4825-a39d-8b564bd4ea1c\") " pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:04 crc kubenswrapper[4771]: I1001 15:14:04.913541 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:05 crc kubenswrapper[4771]: I1001 15:14:05.403153 4771 generic.go:334] "Generic (PLEG): container finished" podID="7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c" containerID="758ccba9aed842f21dfca918e8574ac6b220bd4a6467406d8a7aafee26ba08a9" exitCode=0 Oct 01 15:14:05 crc kubenswrapper[4771]: I1001 15:14:05.403204 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qfvj2" event={"ID":"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c","Type":"ContainerDied","Data":"758ccba9aed842f21dfca918e8574ac6b220bd4a6467406d8a7aafee26ba08a9"} Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.316643 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.482327 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-combined-ca-bundle\") pod \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.489090 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcfr7\" (UniqueName: \"kubernetes.io/projected/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-kube-api-access-kcfr7\") pod \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.489280 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-config\") pod \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\" (UID: \"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c\") " Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.499118 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-kube-api-access-kcfr7" (OuterVolumeSpecName: "kube-api-access-kcfr7") pod "7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c" (UID: "7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c"). InnerVolumeSpecName "kube-api-access-kcfr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.533842 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c" (UID: "7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.533944 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-qfvj2" event={"ID":"7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c","Type":"ContainerDied","Data":"f005d94ff6b702b1b395c577e4e295bb5f7843ba209aadf1c1c6a3b26730a72b"} Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.533970 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f005d94ff6b702b1b395c577e4e295bb5f7843ba209aadf1c1c6a3b26730a72b" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.534022 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-qfvj2" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.538795 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-config" (OuterVolumeSpecName: "config") pod "7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c" (UID: "7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.595861 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcfr7\" (UniqueName: \"kubernetes.io/projected/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-kube-api-access-kcfr7\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.595893 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.595903 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:08 crc kubenswrapper[4771]: I1001 15:14:08.879595 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-74b8bdcb7c-xttgq"] Oct 01 15:14:08 crc kubenswrapper[4771]: W1001 15:14:08.923222 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e356f03_9445_4825_a39d_8b564bd4ea1c.slice/crio-267e3ffa1daad5dd0e18b6005fcc10c2f071c406c2e3a0239722cc3d5daf475d WatchSource:0}: Error finding container 267e3ffa1daad5dd0e18b6005fcc10c2f071c406c2e3a0239722cc3d5daf475d: Status 404 returned error can't find the container with id 267e3ffa1daad5dd0e18b6005fcc10c2f071c406c2e3a0239722cc3d5daf475d Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.680814 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65965d6475-kvgqz"] Oct 01 15:14:09 crc kubenswrapper[4771]: E1001 15:14:09.686282 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c" containerName="neutron-db-sync" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.686593 4771 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c" containerName="neutron-db-sync" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.687429 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c" containerName="neutron-db-sync" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.690962 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.718167 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-74b8bdcb7c-xttgq" event={"ID":"5e356f03-9445-4825-a39d-8b564bd4ea1c","Type":"ContainerStarted","Data":"302735efe208377875ca639d227a5d4436936a6cbc919b5e89dad9d4b186eb31"} Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.718207 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-74b8bdcb7c-xttgq" event={"ID":"5e356f03-9445-4825-a39d-8b564bd4ea1c","Type":"ContainerStarted","Data":"267e3ffa1daad5dd0e18b6005fcc10c2f071c406c2e3a0239722cc3d5daf475d"} Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.727938 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-kvgqz"] Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.728952 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.745983 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6c4877d5c6-thmgz"] Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.747652 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.751991 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c4877d5c6-thmgz"] Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.753970 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-config\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754056 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-combined-ca-bundle\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-config\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754251 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-svc\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754335 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-ovndb-tls-certs\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754400 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlcw5\" (UniqueName: \"kubernetes.io/projected/f66ea972-6475-4624-a093-8884ead588f8-kube-api-access-xlcw5\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754628 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-nb\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-httpd-config\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.754792 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.757625 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsz66\" (UniqueName: \"kubernetes.io/projected/b3ad98d7-c092-4ef4-96b7-194255e37e83-kube-api-access-fsz66\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.761374 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-m89mp" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.761671 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.761848 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.762024 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.767827 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nvszz" event={"ID":"313f9ab1-8fa8-476f-94cb-1d94bd975a06","Type":"ContainerStarted","Data":"5f20731ceb27dc77af04cf960e63721a47976e8675a07ee752f038d50d44be44"} Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.785667 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d4zpd" event={"ID":"91950d77-9457-412f-be07-626b553f6b8d","Type":"ContainerStarted","Data":"d3afd8daf31a54889ea06e80fb2e696bebb9f2f782ba5c2d3ff182daf3212ac4"} Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 
15:14:09.810001 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127a1d07-d1f4-4c95-abf8-da08884ea57a","Type":"ContainerStarted","Data":"0aef501b8ed99dbcdaf337e52e02094a16b36ba8436428288b3845e0ff56df48"} Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.811679 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-74b8bdcb7c-xttgq" podStartSLOduration=5.8116621760000005 podStartE2EDuration="5.811662176s" podCreationTimestamp="2025-10-01 15:14:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:09.809234016 +0000 UTC m=+1094.428409187" watchObservedRunningTime="2025-10-01 15:14:09.811662176 +0000 UTC m=+1094.430837347" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.848363 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-d4zpd" podStartSLOduration=3.866849799 podStartE2EDuration="48.848346443s" podCreationTimestamp="2025-10-01 15:13:21 +0000 UTC" firstStartedPulling="2025-10-01 15:13:23.403677983 +0000 UTC m=+1048.022853154" lastFinishedPulling="2025-10-01 15:14:08.385174627 +0000 UTC m=+1093.004349798" observedRunningTime="2025-10-01 15:14:09.829156199 +0000 UTC m=+1094.448331370" watchObservedRunningTime="2025-10-01 15:14:09.848346443 +0000 UTC m=+1094.467521614" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.859073 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-nb\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.859225 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-httpd-config\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.859316 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.859427 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsz66\" (UniqueName: \"kubernetes.io/projected/b3ad98d7-c092-4ef4-96b7-194255e37e83-kube-api-access-fsz66\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.859494 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-config\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.859556 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-combined-ca-bundle\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.859645 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-config\") pod 
\"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.859990 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-svc\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.860068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-ovndb-tls-certs\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.860153 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.860244 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlcw5\" (UniqueName: \"kubernetes.io/projected/f66ea972-6475-4624-a093-8884ead588f8-kube-api-access-xlcw5\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.861500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-nb\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: 
\"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.866980 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-httpd-config\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.868364 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-config\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.869211 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-svc\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.872178 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-ovndb-tls-certs\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.872813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-swift-storage-0\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 
15:14:09.873624 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-sb\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.876220 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-nvszz" podStartSLOduration=3.596984579 podStartE2EDuration="48.876201251s" podCreationTimestamp="2025-10-01 15:13:21 +0000 UTC" firstStartedPulling="2025-10-01 15:13:23.423037611 +0000 UTC m=+1048.042212782" lastFinishedPulling="2025-10-01 15:14:08.702254283 +0000 UTC m=+1093.321429454" observedRunningTime="2025-10-01 15:14:09.855811938 +0000 UTC m=+1094.474987119" watchObservedRunningTime="2025-10-01 15:14:09.876201251 +0000 UTC m=+1094.495376422" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.877058 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-config\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.881294 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-combined-ca-bundle\") pod \"neutron-6c4877d5c6-thmgz\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.884524 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlcw5\" (UniqueName: \"kubernetes.io/projected/f66ea972-6475-4624-a093-8884ead588f8-kube-api-access-xlcw5\") pod \"neutron-6c4877d5c6-thmgz\" (UID: 
\"f66ea972-6475-4624-a093-8884ead588f8\") " pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.891315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gq7gl" event={"ID":"7b7689a2-6ac8-47ac-86f7-7456994c39ca","Type":"ContainerStarted","Data":"c00acf985cfe64678616d36ec81d0a930db7d80fdf370d7eb6a3a5fa5167bcb0"} Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.900257 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsz66\" (UniqueName: \"kubernetes.io/projected/b3ad98d7-c092-4ef4-96b7-194255e37e83-kube-api-access-fsz66\") pod \"dnsmasq-dns-65965d6475-kvgqz\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:09 crc kubenswrapper[4771]: I1001 15:14:09.929630 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gq7gl" podStartSLOduration=3.130612093 podStartE2EDuration="53.929608901s" podCreationTimestamp="2025-10-01 15:13:16 +0000 UTC" firstStartedPulling="2025-10-01 15:13:17.582236951 +0000 UTC m=+1042.201412132" lastFinishedPulling="2025-10-01 15:14:08.381233769 +0000 UTC m=+1093.000408940" observedRunningTime="2025-10-01 15:14:09.919094332 +0000 UTC m=+1094.538269503" watchObservedRunningTime="2025-10-01 15:14:09.929608901 +0000 UTC m=+1094.548784072" Oct 01 15:14:10 crc kubenswrapper[4771]: I1001 15:14:10.099471 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:10 crc kubenswrapper[4771]: I1001 15:14:10.116190 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:10 crc kubenswrapper[4771]: I1001 15:14:10.848410 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-kvgqz"] Oct 01 15:14:10 crc kubenswrapper[4771]: I1001 15:14:10.857371 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b48cbbc84-ndtts" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 01 15:14:10 crc kubenswrapper[4771]: I1001 15:14:10.906822 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" event={"ID":"b3ad98d7-c092-4ef4-96b7-194255e37e83","Type":"ContainerStarted","Data":"1527d0e9c8b96e4f81c5aada4a981f12a1699c639f2b1fe2ca1041fc86dedce1"} Oct 01 15:14:10 crc kubenswrapper[4771]: I1001 15:14:10.928091 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-66679756f6-g56hw" podUID="405e12dd-6888-4994-ac26-b2836ad9069c" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.148:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.148:8443: connect: connection refused" Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.649446 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6c4877d5c6-thmgz"] Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.889673 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67bb68cc5c-l7gnn"] Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.891406 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.894322 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.899644 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67bb68cc5c-l7gnn"] Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.908945 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.936709 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3ad98d7-c092-4ef4-96b7-194255e37e83" containerID="154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649" exitCode=0 Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.936790 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" event={"ID":"b3ad98d7-c092-4ef4-96b7-194255e37e83","Type":"ContainerDied","Data":"154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649"} Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.951551 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4877d5c6-thmgz" event={"ID":"f66ea972-6475-4624-a093-8884ead588f8","Type":"ContainerStarted","Data":"c12c40f91ddd483f75bcfb4e3d79ca5d1419d45516bc1bb57e17bc3bac935235"} Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.964253 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-99rss" event={"ID":"9a523e37-804a-4173-8012-19848efc8cc0","Type":"ContainerStarted","Data":"d140b7ce466b3823c65fe198c4f62d5d313c28d82b7acdeee65dd69e6f7610ac"} Oct 01 15:14:11 crc kubenswrapper[4771]: I1001 15:14:11.984602 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-99rss" podStartSLOduration=4.906410066 podStartE2EDuration="50.984573801s" 
podCreationTimestamp="2025-10-01 15:13:21 +0000 UTC" firstStartedPulling="2025-10-01 15:13:23.401292143 +0000 UTC m=+1048.020467314" lastFinishedPulling="2025-10-01 15:14:09.479455868 +0000 UTC m=+1094.098631049" observedRunningTime="2025-10-01 15:14:11.980262964 +0000 UTC m=+1096.599438145" watchObservedRunningTime="2025-10-01 15:14:11.984573801 +0000 UTC m=+1096.603748972" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.010703 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-internal-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.011119 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-config\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.011159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-public-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.011230 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-ovndb-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc 
kubenswrapper[4771]: I1001 15:14:12.011264 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt99z\" (UniqueName: \"kubernetes.io/projected/a7b28e9a-d59d-4aba-97c6-9102ada72a28-kube-api-access-wt99z\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.011359 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-httpd-config\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.013313 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-combined-ca-bundle\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.115249 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt99z\" (UniqueName: \"kubernetes.io/projected/a7b28e9a-d59d-4aba-97c6-9102ada72a28-kube-api-access-wt99z\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.115312 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-httpd-config\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc 
kubenswrapper[4771]: I1001 15:14:12.115371 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-combined-ca-bundle\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.115425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-internal-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.115462 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-config\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.115489 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-public-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.115691 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-ovndb-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.126195 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-httpd-config\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.126981 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-combined-ca-bundle\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.135527 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-internal-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.139416 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-public-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.140476 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-config\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.143678 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt99z\" (UniqueName: \"kubernetes.io/projected/a7b28e9a-d59d-4aba-97c6-9102ada72a28-kube-api-access-wt99z\") 
pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.155115 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7b28e9a-d59d-4aba-97c6-9102ada72a28-ovndb-tls-certs\") pod \"neutron-67bb68cc5c-l7gnn\" (UID: \"a7b28e9a-d59d-4aba-97c6-9102ada72a28\") " pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.177684 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.178075 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.254417 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.924849 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67bb68cc5c-l7gnn"] Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.990761 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4877d5c6-thmgz" event={"ID":"f66ea972-6475-4624-a093-8884ead588f8","Type":"ContainerStarted","Data":"1295acd6636ce8c5c95b86fa2c5b81795c12be6d40eb7105163c265e27ddebc4"} Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.990797 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4877d5c6-thmgz" event={"ID":"f66ea972-6475-4624-a093-8884ead588f8","Type":"ContainerStarted","Data":"23622ff95d6dcef208b1ee5e0aa03a223373fc0f91fb4beeaf4078217eaab3c5"} Oct 01 15:14:12 crc kubenswrapper[4771]: I1001 15:14:12.990852 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:13 crc kubenswrapper[4771]: I1001 15:14:13.004019 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bb68cc5c-l7gnn" event={"ID":"a7b28e9a-d59d-4aba-97c6-9102ada72a28","Type":"ContainerStarted","Data":"269a04810558df94a98a05b875fa5afe444612f2c8d3fd21a479691a2aa56909"} Oct 01 15:14:13 crc kubenswrapper[4771]: I1001 15:14:13.009623 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" event={"ID":"b3ad98d7-c092-4ef4-96b7-194255e37e83","Type":"ContainerStarted","Data":"909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409"} Oct 01 15:14:13 crc kubenswrapper[4771]: I1001 15:14:13.010501 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:13 crc kubenswrapper[4771]: I1001 15:14:13.037400 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-6c4877d5c6-thmgz" podStartSLOduration=4.037374846 podStartE2EDuration="4.037374846s" podCreationTimestamp="2025-10-01 15:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:13.012748007 +0000 UTC m=+1097.631923198" watchObservedRunningTime="2025-10-01 15:14:13.037374846 +0000 UTC m=+1097.656550027" Oct 01 15:14:13 crc kubenswrapper[4771]: I1001 15:14:13.047075 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" podStartSLOduration=4.047058506 podStartE2EDuration="4.047058506s" podCreationTimestamp="2025-10-01 15:14:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:13.030805124 +0000 UTC m=+1097.649980315" watchObservedRunningTime="2025-10-01 15:14:13.047058506 +0000 UTC m=+1097.666233677" Oct 01 15:14:14 crc kubenswrapper[4771]: I1001 15:14:14.018435 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bb68cc5c-l7gnn" event={"ID":"a7b28e9a-d59d-4aba-97c6-9102ada72a28","Type":"ContainerStarted","Data":"e6bdea6c1fc486fa37f37141d360dcdd4489aff2f3b55087894165e797e27479"} Oct 01 15:14:14 crc kubenswrapper[4771]: I1001 15:14:14.018782 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67bb68cc5c-l7gnn" event={"ID":"a7b28e9a-d59d-4aba-97c6-9102ada72a28","Type":"ContainerStarted","Data":"290cc3c110c912385d4df64ff3f1fc94c9718c62265b84e207b21d92708396e2"} Oct 01 15:14:14 crc kubenswrapper[4771]: I1001 15:14:14.035229 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67bb68cc5c-l7gnn" podStartSLOduration=3.035216024 podStartE2EDuration="3.035216024s" podCreationTimestamp="2025-10-01 15:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:14.033195173 +0000 UTC m=+1098.652370344" watchObservedRunningTime="2025-10-01 15:14:14.035216024 +0000 UTC m=+1098.654391195" Oct 01 15:14:15 crc kubenswrapper[4771]: I1001 15:14:15.026091 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:17 crc kubenswrapper[4771]: I1001 15:14:17.048451 4771 generic.go:334] "Generic (PLEG): container finished" podID="91950d77-9457-412f-be07-626b553f6b8d" containerID="d3afd8daf31a54889ea06e80fb2e696bebb9f2f782ba5c2d3ff182daf3212ac4" exitCode=0 Oct 01 15:14:17 crc kubenswrapper[4771]: I1001 15:14:17.048481 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d4zpd" event={"ID":"91950d77-9457-412f-be07-626b553f6b8d","Type":"ContainerDied","Data":"d3afd8daf31a54889ea06e80fb2e696bebb9f2f782ba5c2d3ff182daf3212ac4"} Oct 01 15:14:19 crc kubenswrapper[4771]: I1001 15:14:19.950209 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-d4zpd" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.065581 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlfnq\" (UniqueName: \"kubernetes.io/projected/91950d77-9457-412f-be07-626b553f6b8d-kube-api-access-zlfnq\") pod \"91950d77-9457-412f-be07-626b553f6b8d\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.065670 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-scripts\") pod \"91950d77-9457-412f-be07-626b553f6b8d\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.065756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91950d77-9457-412f-be07-626b553f6b8d-logs\") pod \"91950d77-9457-412f-be07-626b553f6b8d\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.065812 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-config-data\") pod \"91950d77-9457-412f-be07-626b553f6b8d\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.065875 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-combined-ca-bundle\") pod \"91950d77-9457-412f-be07-626b553f6b8d\" (UID: \"91950d77-9457-412f-be07-626b553f6b8d\") " Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.066322 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/91950d77-9457-412f-be07-626b553f6b8d-logs" (OuterVolumeSpecName: "logs") pod "91950d77-9457-412f-be07-626b553f6b8d" (UID: "91950d77-9457-412f-be07-626b553f6b8d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.066774 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/91950d77-9457-412f-be07-626b553f6b8d-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.079980 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91950d77-9457-412f-be07-626b553f6b8d-kube-api-access-zlfnq" (OuterVolumeSpecName: "kube-api-access-zlfnq") pod "91950d77-9457-412f-be07-626b553f6b8d" (UID: "91950d77-9457-412f-be07-626b553f6b8d"). InnerVolumeSpecName "kube-api-access-zlfnq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.080835 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-scripts" (OuterVolumeSpecName: "scripts") pod "91950d77-9457-412f-be07-626b553f6b8d" (UID: "91950d77-9457-412f-be07-626b553f6b8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.097678 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91950d77-9457-412f-be07-626b553f6b8d" (UID: "91950d77-9457-412f-be07-626b553f6b8d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.106426 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.116943 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-d4zpd" event={"ID":"91950d77-9457-412f-be07-626b553f6b8d","Type":"ContainerDied","Data":"7cb9677ed675d24cb8cd0ba643e00fd75391fcb66fa1656591799e73b061532e"} Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.116985 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cb9677ed675d24cb8cd0ba643e00fd75391fcb66fa1656591799e73b061532e" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.117049 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-d4zpd" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.170869 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.171104 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlfnq\" (UniqueName: \"kubernetes.io/projected/91950d77-9457-412f-be07-626b553f6b8d-kube-api-access-zlfnq\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.171115 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.205500 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qjn7w"] Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.205768 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" podUID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" containerName="dnsmasq-dns" containerID="cri-o://f749f84155fd265a1e036a5648224fc00c532993dd4d7c8e0cd2bc56722c553a" gracePeriod=10 Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.214983 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-config-data" (OuterVolumeSpecName: "config-data") pod "91950d77-9457-412f-be07-626b553f6b8d" (UID: "91950d77-9457-412f-be07-626b553f6b8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.273320 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/91950d77-9457-412f-be07-626b553f6b8d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:20 crc kubenswrapper[4771]: I1001 15:14:20.843988 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-b48cbbc84-ndtts" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.147:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.147:8443: connect: connection refused" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.055703 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-67bb557f68-mz5cv"] Oct 01 15:14:21 crc kubenswrapper[4771]: E1001 15:14:21.056457 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91950d77-9457-412f-be07-626b553f6b8d" containerName="placement-db-sync" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.056473 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="91950d77-9457-412f-be07-626b553f6b8d" containerName="placement-db-sync" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.056694 4771 
memory_manager.go:354] "RemoveStaleState removing state" podUID="91950d77-9457-412f-be07-626b553f6b8d" containerName="placement-db-sync" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.057806 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.061088 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.061325 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-w4m4k" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.061467 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.061662 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.061794 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.069795 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67bb557f68-mz5cv"] Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.128224 4771 generic.go:334] "Generic (PLEG): container finished" podID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" containerID="f749f84155fd265a1e036a5648224fc00c532993dd4d7c8e0cd2bc56722c553a" exitCode=0 Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.128279 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" event={"ID":"ceb96a8c-ce5a-420e-aa1e-594d09fc1487","Type":"ContainerDied","Data":"f749f84155fd265a1e036a5648224fc00c532993dd4d7c8e0cd2bc56722c553a"} Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.128304 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" event={"ID":"ceb96a8c-ce5a-420e-aa1e-594d09fc1487","Type":"ContainerDied","Data":"07acfd45c9803ba4f89499650d17e0b3a8fce807090693a47c1ee664dee47e74"} Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.128335 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="07acfd45c9803ba4f89499650d17e0b3a8fce807090693a47c1ee664dee47e74" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.189507 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-config-data\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.189562 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sccm\" (UniqueName: \"kubernetes.io/projected/6dfa4374-0400-489e-90eb-baca0f8afdfd-kube-api-access-4sccm\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.189704 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-internal-tls-certs\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.189766 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-public-tls-certs\") pod \"placement-67bb557f68-mz5cv\" (UID: 
\"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.189843 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-combined-ca-bundle\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.189881 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-scripts\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.189899 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dfa4374-0400-489e-90eb-baca0f8afdfd-logs\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.211174 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292050 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb\") pod \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292123 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-svc\") pod \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292196 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-swift-storage-0\") pod \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292223 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhkrp\" (UniqueName: \"kubernetes.io/projected/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-kube-api-access-jhkrp\") pod \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292242 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-config\") pod \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292390 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-nb\") pod \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") " Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-combined-ca-bundle\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292790 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-scripts\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292818 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dfa4374-0400-489e-90eb-baca0f8afdfd-logs\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292872 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-config-data\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292908 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sccm\" (UniqueName: \"kubernetes.io/projected/6dfa4374-0400-489e-90eb-baca0f8afdfd-kube-api-access-4sccm\") pod \"placement-67bb557f68-mz5cv\" (UID: 
\"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.292995 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-internal-tls-certs\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.293913 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-public-tls-certs\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.298711 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-public-tls-certs\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.302715 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-scripts\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.302980 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6dfa4374-0400-489e-90eb-baca0f8afdfd-logs\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.306342 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-config-data\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.310475 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-internal-tls-certs\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.315335 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dfa4374-0400-489e-90eb-baca0f8afdfd-combined-ca-bundle\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: E1001 15:14:21.319945 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.330780 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-kube-api-access-jhkrp" (OuterVolumeSpecName: "kube-api-access-jhkrp") pod "ceb96a8c-ce5a-420e-aa1e-594d09fc1487" (UID: "ceb96a8c-ce5a-420e-aa1e-594d09fc1487"). InnerVolumeSpecName "kube-api-access-jhkrp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.355844 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ceb96a8c-ce5a-420e-aa1e-594d09fc1487" (UID: "ceb96a8c-ce5a-420e-aa1e-594d09fc1487"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.372020 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sccm\" (UniqueName: \"kubernetes.io/projected/6dfa4374-0400-489e-90eb-baca0f8afdfd-kube-api-access-4sccm\") pod \"placement-67bb557f68-mz5cv\" (UID: \"6dfa4374-0400-489e-90eb-baca0f8afdfd\") " pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.396453 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.396494 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhkrp\" (UniqueName: \"kubernetes.io/projected/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-kube-api-access-jhkrp\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.401760 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-config" (OuterVolumeSpecName: "config") pod "ceb96a8c-ce5a-420e-aa1e-594d09fc1487" (UID: "ceb96a8c-ce5a-420e-aa1e-594d09fc1487"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.401837 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ceb96a8c-ce5a-420e-aa1e-594d09fc1487" (UID: "ceb96a8c-ce5a-420e-aa1e-594d09fc1487"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 15:14:21 crc kubenswrapper[4771]: E1001 15:14:21.466329 4771 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb podName:ceb96a8c-ce5a-420e-aa1e-594d09fc1487 nodeName:}" failed. No retries permitted until 2025-10-01 15:14:21.966297075 +0000 UTC m=+1106.585472256 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ovsdbserver-sb" (UniqueName: "kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb") pod "ceb96a8c-ce5a-420e-aa1e-594d09fc1487" (UID: "ceb96a8c-ce5a-420e-aa1e-594d09fc1487") : error deleting /var/lib/kubelet/pods/ceb96a8c-ce5a-420e-aa1e-594d09fc1487/volume-subpaths: remove /var/lib/kubelet/pods/ceb96a8c-ce5a-420e-aa1e-594d09fc1487/volume-subpaths: no such file or directory
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.498406 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-config\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.498449 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-dns-svc\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.541415 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ceb96a8c-ce5a-420e-aa1e-594d09fc1487" (UID: "ceb96a8c-ce5a-420e-aa1e-594d09fc1487"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.547966 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:21 crc kubenswrapper[4771]: I1001 15:14:21.600823 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.008037 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb\") pod \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\" (UID: \"ceb96a8c-ce5a-420e-aa1e-594d09fc1487\") "
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.008693 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ceb96a8c-ce5a-420e-aa1e-594d09fc1487" (UID: "ceb96a8c-ce5a-420e-aa1e-594d09fc1487"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 15:14:22 crc kubenswrapper[4771]: W1001 15:14:22.043150 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6dfa4374_0400_489e_90eb_baca0f8afdfd.slice/crio-db975082a186087e49df87e7b728933561aef059fa236e8ac2a83dab2741706e WatchSource:0}: Error finding container db975082a186087e49df87e7b728933561aef059fa236e8ac2a83dab2741706e: Status 404 returned error can't find the container with id db975082a186087e49df87e7b728933561aef059fa236e8ac2a83dab2741706e
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.044058 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-67bb557f68-mz5cv"]
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.110803 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ceb96a8c-ce5a-420e-aa1e-594d09fc1487-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.146290 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127a1d07-d1f4-4c95-abf8-da08884ea57a","Type":"ContainerStarted","Data":"549414480fe1f01dcde3c7a3370fff68f0e7e8fd016e99e433070e1d295463d0"}
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.146490 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="ceilometer-notification-agent" containerID="cri-o://67d4f49b7771a6cd394edbf1f7a2ab2572e824e9ea022772d3e883951c3927c0" gracePeriod=30
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.146792 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.147131 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="proxy-httpd" containerID="cri-o://549414480fe1f01dcde3c7a3370fff68f0e7e8fd016e99e433070e1d295463d0" gracePeriod=30
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.147210 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="sg-core" containerID="cri-o://0aef501b8ed99dbcdaf337e52e02094a16b36ba8436428288b3845e0ff56df48" gracePeriod=30
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.158445 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67bb557f68-mz5cv" event={"ID":"6dfa4374-0400-489e-90eb-baca0f8afdfd","Type":"ContainerStarted","Data":"db975082a186087e49df87e7b728933561aef059fa236e8ac2a83dab2741706e"}
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.162120 4771 generic.go:334] "Generic (PLEG): container finished" podID="313f9ab1-8fa8-476f-94cb-1d94bd975a06" containerID="5f20731ceb27dc77af04cf960e63721a47976e8675a07ee752f038d50d44be44" exitCode=0
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.162224 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76fcf4b695-qjn7w"
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.162217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nvszz" event={"ID":"313f9ab1-8fa8-476f-94cb-1d94bd975a06","Type":"ContainerDied","Data":"5f20731ceb27dc77af04cf960e63721a47976e8675a07ee752f038d50d44be44"}
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.342490 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qjn7w"]
Oct 01 15:14:22 crc kubenswrapper[4771]: I1001 15:14:22.350209 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76fcf4b695-qjn7w"]
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.028065 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-66679756f6-g56hw"
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.183398 4771 generic.go:334] "Generic (PLEG): container finished" podID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerID="549414480fe1f01dcde3c7a3370fff68f0e7e8fd016e99e433070e1d295463d0" exitCode=0
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.183725 4771 generic.go:334] "Generic (PLEG): container finished" podID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerID="0aef501b8ed99dbcdaf337e52e02094a16b36ba8436428288b3845e0ff56df48" exitCode=2
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.183756 4771 generic.go:334] "Generic (PLEG): container finished" podID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerID="67d4f49b7771a6cd394edbf1f7a2ab2572e824e9ea022772d3e883951c3927c0" exitCode=0
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.183468 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127a1d07-d1f4-4c95-abf8-da08884ea57a","Type":"ContainerDied","Data":"549414480fe1f01dcde3c7a3370fff68f0e7e8fd016e99e433070e1d295463d0"}
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.183832 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127a1d07-d1f4-4c95-abf8-da08884ea57a","Type":"ContainerDied","Data":"0aef501b8ed99dbcdaf337e52e02094a16b36ba8436428288b3845e0ff56df48"}
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.183850 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127a1d07-d1f4-4c95-abf8-da08884ea57a","Type":"ContainerDied","Data":"67d4f49b7771a6cd394edbf1f7a2ab2572e824e9ea022772d3e883951c3927c0"}
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.183863 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"127a1d07-d1f4-4c95-abf8-da08884ea57a","Type":"ContainerDied","Data":"a75867f32bf3e6dab26e393500b4de93fda7ff71f88a394bd3f0c6bfa4b83619"}
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.183875 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a75867f32bf3e6dab26e393500b4de93fda7ff71f88a394bd3f0c6bfa4b83619"
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.193693 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67bb557f68-mz5cv" event={"ID":"6dfa4374-0400-489e-90eb-baca0f8afdfd","Type":"ContainerStarted","Data":"079d920c86b35ec77859d9c19317a240ac31f77d2ff0bb50b381236fff5d524e"}
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.193723 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-67bb557f68-mz5cv" event={"ID":"6dfa4374-0400-489e-90eb-baca0f8afdfd","Type":"ContainerStarted","Data":"700408729c410d7c77ecf81f34c7c68e8365ac8b2eed836283cc8ea089450d39"}
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.193788 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.193806 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-67bb557f68-mz5cv"
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.219481 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-67bb557f68-mz5cv" podStartSLOduration=2.219462753 podStartE2EDuration="2.219462753s" podCreationTimestamp="2025-10-01 15:14:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:23.217082894 +0000 UTC m=+1107.836258085" watchObservedRunningTime="2025-10-01 15:14:23.219462753 +0000 UTC m=+1107.838637924"
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.293310 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.437602 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-combined-ca-bundle\") pod \"127a1d07-d1f4-4c95-abf8-da08884ea57a\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.438186 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-sg-core-conf-yaml\") pod \"127a1d07-d1f4-4c95-abf8-da08884ea57a\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.438302 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-scripts\") pod \"127a1d07-d1f4-4c95-abf8-da08884ea57a\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.438358 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-log-httpd\") pod \"127a1d07-d1f4-4c95-abf8-da08884ea57a\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.438410 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-config-data\") pod \"127a1d07-d1f4-4c95-abf8-da08884ea57a\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.438435 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g54n\" (UniqueName: \"kubernetes.io/projected/127a1d07-d1f4-4c95-abf8-da08884ea57a-kube-api-access-4g54n\") pod \"127a1d07-d1f4-4c95-abf8-da08884ea57a\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.438473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-run-httpd\") pod \"127a1d07-d1f4-4c95-abf8-da08884ea57a\" (UID: \"127a1d07-d1f4-4c95-abf8-da08884ea57a\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.438924 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "127a1d07-d1f4-4c95-abf8-da08884ea57a" (UID: "127a1d07-d1f4-4c95-abf8-da08884ea57a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.439376 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.439761 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "127a1d07-d1f4-4c95-abf8-da08884ea57a" (UID: "127a1d07-d1f4-4c95-abf8-da08884ea57a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.445872 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-scripts" (OuterVolumeSpecName: "scripts") pod "127a1d07-d1f4-4c95-abf8-da08884ea57a" (UID: "127a1d07-d1f4-4c95-abf8-da08884ea57a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.451957 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127a1d07-d1f4-4c95-abf8-da08884ea57a-kube-api-access-4g54n" (OuterVolumeSpecName: "kube-api-access-4g54n") pod "127a1d07-d1f4-4c95-abf8-da08884ea57a" (UID: "127a1d07-d1f4-4c95-abf8-da08884ea57a"). InnerVolumeSpecName "kube-api-access-4g54n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.466817 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "127a1d07-d1f4-4c95-abf8-da08884ea57a" (UID: "127a1d07-d1f4-4c95-abf8-da08884ea57a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.515374 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "127a1d07-d1f4-4c95-abf8-da08884ea57a" (UID: "127a1d07-d1f4-4c95-abf8-da08884ea57a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.527445 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-config-data" (OuterVolumeSpecName: "config-data") pod "127a1d07-d1f4-4c95-abf8-da08884ea57a" (UID: "127a1d07-d1f4-4c95-abf8-da08884ea57a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.540959 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.541152 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/127a1d07-d1f4-4c95-abf8-da08884ea57a-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.541211 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.541265 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g54n\" (UniqueName: \"kubernetes.io/projected/127a1d07-d1f4-4c95-abf8-da08884ea57a-kube-api-access-4g54n\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.541339 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.541406 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/127a1d07-d1f4-4c95-abf8-da08884ea57a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.576791 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nvszz"
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.744073 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-db-sync-config-data\") pod \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.744157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-combined-ca-bundle\") pod \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.744211 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg27h\" (UniqueName: \"kubernetes.io/projected/313f9ab1-8fa8-476f-94cb-1d94bd975a06-kube-api-access-tg27h\") pod \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\" (UID: \"313f9ab1-8fa8-476f-94cb-1d94bd975a06\") "
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.748207 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313f9ab1-8fa8-476f-94cb-1d94bd975a06-kube-api-access-tg27h" (OuterVolumeSpecName: "kube-api-access-tg27h") pod "313f9ab1-8fa8-476f-94cb-1d94bd975a06" (UID: "313f9ab1-8fa8-476f-94cb-1d94bd975a06"). InnerVolumeSpecName "kube-api-access-tg27h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.748595 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "313f9ab1-8fa8-476f-94cb-1d94bd975a06" (UID: "313f9ab1-8fa8-476f-94cb-1d94bd975a06"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.770451 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "313f9ab1-8fa8-476f-94cb-1d94bd975a06" (UID: "313f9ab1-8fa8-476f-94cb-1d94bd975a06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.846560 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.846607 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313f9ab1-8fa8-476f-94cb-1d94bd975a06-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.846624 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tg27h\" (UniqueName: \"kubernetes.io/projected/313f9ab1-8fa8-476f-94cb-1d94bd975a06-kube-api-access-tg27h\") on node \"crc\" DevicePath \"\""
Oct 01 15:14:23 crc kubenswrapper[4771]: I1001 15:14:23.999757 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" path="/var/lib/kubelet/pods/ceb96a8c-ce5a-420e-aa1e-594d09fc1487/volumes"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.204303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-nvszz" event={"ID":"313f9ab1-8fa8-476f-94cb-1d94bd975a06","Type":"ContainerDied","Data":"dc53eed9b5af672fac2aa9b758cb652b9ba69ac0bdd36a39b3f5466d7b845360"}
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.205435 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc53eed9b5af672fac2aa9b758cb652b9ba69ac0bdd36a39b3f5466d7b845360"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.204434 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-nvszz"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.205814 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.285290 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.302265 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.324114 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:14:24 crc kubenswrapper[4771]: E1001 15:14:24.326347 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" containerName="dnsmasq-dns"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.326384 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" containerName="dnsmasq-dns"
Oct 01 15:14:24 crc kubenswrapper[4771]: E1001 15:14:24.326416 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" containerName="init"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.326425 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" containerName="init"
Oct 01 15:14:24 crc kubenswrapper[4771]: E1001 15:14:24.326473 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="ceilometer-notification-agent"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.326483 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="ceilometer-notification-agent"
Oct 01 15:14:24 crc kubenswrapper[4771]: E1001 15:14:24.326513 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="sg-core"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.326521 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="sg-core"
Oct 01 15:14:24 crc kubenswrapper[4771]: E1001 15:14:24.326550 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="proxy-httpd"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.326559 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="proxy-httpd"
Oct 01 15:14:24 crc kubenswrapper[4771]: E1001 15:14:24.326583 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313f9ab1-8fa8-476f-94cb-1d94bd975a06" containerName="barbican-db-sync"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.326593 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="313f9ab1-8fa8-476f-94cb-1d94bd975a06" containerName="barbican-db-sync"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.326995 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="proxy-httpd"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.327027 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="sg-core"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.327051 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="313f9ab1-8fa8-476f-94cb-1d94bd975a06" containerName="barbican-db-sync"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.327073 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ceb96a8c-ce5a-420e-aa1e-594d09fc1487" containerName="dnsmasq-dns"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.327097 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" containerName="ceilometer-notification-agent"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.340006 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.345387 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.345630 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.375588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.458431 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hjwd\" (UniqueName: \"kubernetes.io/projected/501a1350-8a05-4e56-8e04-57cb1f4c721b-kube-api-access-2hjwd\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.458502 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-run-httpd\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.458523 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.458537 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-log-httpd\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.458564 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-config-data\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.458614 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-scripts\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.458627 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.464673 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-74dd9b479f-cpgmx"]
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.468647 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74dd9b479f-cpgmx"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.470813 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-74dd9b479f-cpgmx"]
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.480511 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-755c69f65b-sb4nj"]
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.481976 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.484260 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.484448 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.484584 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-ngq92"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.503130 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.525447 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-755c69f65b-sb4nj"]
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.559751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-config-data-custom\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.559798 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x756\" (UniqueName: \"kubernetes.io/projected/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-kube-api-access-8x756\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj"
Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.559828 4771 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-config-data-custom\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.559861 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-config-data\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.559889 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-scripts\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.559904 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.559954 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-config-data\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.559991 4771 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-2hjwd\" (UniqueName: \"kubernetes.io/projected/501a1350-8a05-4e56-8e04-57cb1f4c721b-kube-api-access-2hjwd\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-logs\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560026 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962b1815-b3dd-47fc-afdf-97a82cc67893-logs\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560069 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8fvt\" (UniqueName: \"kubernetes.io/projected/962b1815-b3dd-47fc-afdf-97a82cc67893-kube-api-access-n8fvt\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560087 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-run-httpd\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560101 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560114 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-log-httpd\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560140 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-config-data\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-combined-ca-bundle\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.560177 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-combined-ca-bundle\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.561494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-log-httpd\") pod 
\"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.564097 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-run-httpd\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.568637 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-scripts\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.572860 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.574535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-config-data\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.574601 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-7vn6t"] Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.576037 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.577477 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.584419 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-7vn6t"] Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.612110 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hjwd\" (UniqueName: \"kubernetes.io/projected/501a1350-8a05-4e56-8e04-57cb1f4c721b-kube-api-access-2hjwd\") pod \"ceilometer-0\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663094 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-config\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663144 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-swift-storage-0\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663193 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-logs\") pod 
\"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962b1815-b3dd-47fc-afdf-97a82cc67893-logs\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663252 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8fvt\" (UniqueName: \"kubernetes.io/projected/962b1815-b3dd-47fc-afdf-97a82cc67893-kube-api-access-n8fvt\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663291 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-sb\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663316 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-nb\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663342 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-combined-ca-bundle\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663380 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-combined-ca-bundle\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663404 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-config-data-custom\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663426 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x756\" (UniqueName: \"kubernetes.io/projected/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-kube-api-access-8x756\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663453 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-config-data-custom\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663480 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctt42\" (UniqueName: \"kubernetes.io/projected/8eccb2db-b788-4639-a452-b2d7738c5126-kube-api-access-ctt42\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663510 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-config-data\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663551 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-svc\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663615 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-config-data\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.663681 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-logs\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.665263 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/962b1815-b3dd-47fc-afdf-97a82cc67893-logs\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.667342 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.668343 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-config-data-custom\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.668901 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-combined-ca-bundle\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.674066 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-config-data-custom\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.688413 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-combined-ca-bundle\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: 
\"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.688967 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-config-data\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.689249 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8fvt\" (UniqueName: \"kubernetes.io/projected/962b1815-b3dd-47fc-afdf-97a82cc67893-kube-api-access-n8fvt\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.692435 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x756\" (UniqueName: \"kubernetes.io/projected/90aaa270-c5a1-47b4-8adc-2bd096da3ab0-kube-api-access-8x756\") pod \"barbican-keystone-listener-755c69f65b-sb4nj\" (UID: \"90aaa270-c5a1-47b4-8adc-2bd096da3ab0\") " pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.695011 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/962b1815-b3dd-47fc-afdf-97a82cc67893-config-data\") pod \"barbican-worker-74dd9b479f-cpgmx\" (UID: \"962b1815-b3dd-47fc-afdf-97a82cc67893\") " pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.697113 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-554565568b-5js82"] Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.698586 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.703525 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.704882 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-554565568b-5js82"] Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.766757 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-svc\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.766849 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-config\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.766872 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-swift-storage-0\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.766917 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-sb\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 
15:14:24.766939 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-nb\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.766980 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctt42\" (UniqueName: \"kubernetes.io/projected/8eccb2db-b788-4639-a452-b2d7738c5126-kube-api-access-ctt42\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.768494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-swift-storage-0\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.768523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-svc\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.771660 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-nb\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.772007 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-sb\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.772470 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-config\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.792284 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctt42\" (UniqueName: \"kubernetes.io/projected/8eccb2db-b788-4639-a452-b2d7738c5126-kube-api-access-ctt42\") pod \"dnsmasq-dns-789c5c5cb7-7vn6t\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.801421 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-74dd9b479f-cpgmx" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.825669 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.871497 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323fb43-89a2-4be3-96fc-f8633f91fd8c-logs\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.871928 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data-custom\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.872114 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wl92\" (UniqueName: \"kubernetes.io/projected/7323fb43-89a2-4be3-96fc-f8633f91fd8c-kube-api-access-8wl92\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.872283 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-combined-ca-bundle\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.872368 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data\") pod 
\"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.975944 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-combined-ca-bundle\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.976011 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.976052 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323fb43-89a2-4be3-96fc-f8633f91fd8c-logs\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.976095 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data-custom\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.976184 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wl92\" (UniqueName: \"kubernetes.io/projected/7323fb43-89a2-4be3-96fc-f8633f91fd8c-kube-api-access-8wl92\") pod \"barbican-api-554565568b-5js82\" (UID: 
\"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.977864 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323fb43-89a2-4be3-96fc-f8633f91fd8c-logs\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.979948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data-custom\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.993322 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-combined-ca-bundle\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.996357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wl92\" (UniqueName: \"kubernetes.io/projected/7323fb43-89a2-4be3-96fc-f8633f91fd8c-kube-api-access-8wl92\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:24 crc kubenswrapper[4771]: I1001 15:14:24.998771 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data\") pod \"barbican-api-554565568b-5js82\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " pod="openstack/barbican-api-554565568b-5js82" 
Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.058477 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.072041 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.167784 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.169103 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-66679756f6-g56hw" Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.231360 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerStarted","Data":"f63089bc5904ee661f4e5ef92227246cbd0972f54e87c821455a99dc13010482"} Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.231767 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b48cbbc84-ndtts"] Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.231953 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b48cbbc84-ndtts" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon-log" containerID="cri-o://a90b6e35d0b25d1bee2280f135b1a7b7fbd994bb638560538cf71e22bdc7f88b" gracePeriod=30 Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.232417 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-b48cbbc84-ndtts" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon" containerID="cri-o://2d01502bbf6fc58e63063d52d24bfa7a5cf14caa1fd2964932d63d8a121a9349" gracePeriod=30 Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.312293 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-74dd9b479f-cpgmx"] Oct 01 15:14:25 crc kubenswrapper[4771]: W1001 15:14:25.408651 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90aaa270_c5a1_47b4_8adc_2bd096da3ab0.slice/crio-17fc9fd3c77af278825fdf9bbc38a19f511d65b2c9aeeecb120d9d9625dc0de3 WatchSource:0}: Error finding container 17fc9fd3c77af278825fdf9bbc38a19f511d65b2c9aeeecb120d9d9625dc0de3: Status 404 returned error can't find the container with id 17fc9fd3c77af278825fdf9bbc38a19f511d65b2c9aeeecb120d9d9625dc0de3 Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.413616 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-755c69f65b-sb4nj"] Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.720281 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-7vn6t"] Oct 01 15:14:25 crc kubenswrapper[4771]: W1001 15:14:25.742711 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8eccb2db_b788_4639_a452_b2d7738c5126.slice/crio-1bc4d7f1aea4ba89007ec35fa5153e8ce0c579983ea02236323e656b940b7252 WatchSource:0}: Error finding container 1bc4d7f1aea4ba89007ec35fa5153e8ce0c579983ea02236323e656b940b7252: Status 404 returned error can't find the container with id 1bc4d7f1aea4ba89007ec35fa5153e8ce0c579983ea02236323e656b940b7252 Oct 01 15:14:25 crc kubenswrapper[4771]: I1001 15:14:25.753811 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-554565568b-5js82"] Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.015442 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127a1d07-d1f4-4c95-abf8-da08884ea57a" path="/var/lib/kubelet/pods/127a1d07-d1f4-4c95-abf8-da08884ea57a/volumes" Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.244227 4771 generic.go:334] "Generic (PLEG): 
container finished" podID="8eccb2db-b788-4639-a452-b2d7738c5126" containerID="1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2" exitCode=0 Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.244579 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" event={"ID":"8eccb2db-b788-4639-a452-b2d7738c5126","Type":"ContainerDied","Data":"1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2"} Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.244611 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" event={"ID":"8eccb2db-b788-4639-a452-b2d7738c5126","Type":"ContainerStarted","Data":"1bc4d7f1aea4ba89007ec35fa5153e8ce0c579983ea02236323e656b940b7252"} Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.250090 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" event={"ID":"90aaa270-c5a1-47b4-8adc-2bd096da3ab0","Type":"ContainerStarted","Data":"17fc9fd3c77af278825fdf9bbc38a19f511d65b2c9aeeecb120d9d9625dc0de3"} Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.252186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerStarted","Data":"de20565d8b41e17201fed26d95a6eeea0e35a0cbbb74b4a088d33af68298dfbf"} Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.259365 4771 generic.go:334] "Generic (PLEG): container finished" podID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerID="2d01502bbf6fc58e63063d52d24bfa7a5cf14caa1fd2964932d63d8a121a9349" exitCode=0 Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.259456 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b48cbbc84-ndtts" event={"ID":"a5159749-76e2-4e89-8a81-59d8ea1ab063","Type":"ContainerDied","Data":"2d01502bbf6fc58e63063d52d24bfa7a5cf14caa1fd2964932d63d8a121a9349"} Oct 01 15:14:26 crc 
kubenswrapper[4771]: I1001 15:14:26.264768 4771 generic.go:334] "Generic (PLEG): container finished" podID="9a523e37-804a-4173-8012-19848efc8cc0" containerID="d140b7ce466b3823c65fe198c4f62d5d313c28d82b7acdeee65dd69e6f7610ac" exitCode=0 Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.264854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-99rss" event={"ID":"9a523e37-804a-4173-8012-19848efc8cc0","Type":"ContainerDied","Data":"d140b7ce466b3823c65fe198c4f62d5d313c28d82b7acdeee65dd69e6f7610ac"} Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.278756 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74dd9b479f-cpgmx" event={"ID":"962b1815-b3dd-47fc-afdf-97a82cc67893","Type":"ContainerStarted","Data":"84a1314b9750e5708cd407569fb882cf7474cb0f671dae1cde3662b28ab8a61d"} Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.300690 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554565568b-5js82" event={"ID":"7323fb43-89a2-4be3-96fc-f8633f91fd8c","Type":"ContainerStarted","Data":"54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68"} Oct 01 15:14:26 crc kubenswrapper[4771]: I1001 15:14:26.300812 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554565568b-5js82" event={"ID":"7323fb43-89a2-4be3-96fc-f8633f91fd8c","Type":"ContainerStarted","Data":"942074b4b9bcfcd068c66ff1047b84b470342195265820a4840a741705fb1a27"} Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.296028 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-59879d9576-fvcgl"] Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.298370 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.301277 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.301297 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.318366 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" event={"ID":"8eccb2db-b788-4639-a452-b2d7738c5126","Type":"ContainerStarted","Data":"a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056"} Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.319393 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.325093 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59879d9576-fvcgl"] Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.332578 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554565568b-5js82" event={"ID":"7323fb43-89a2-4be3-96fc-f8633f91fd8c","Type":"ContainerStarted","Data":"68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f"} Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.332618 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.332644 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.389717 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" podStartSLOduration=3.389700807 podStartE2EDuration="3.389700807s" 
podCreationTimestamp="2025-10-01 15:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:27.365416439 +0000 UTC m=+1111.984591610" watchObservedRunningTime="2025-10-01 15:14:27.389700807 +0000 UTC m=+1112.008875978" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.397724 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-554565568b-5js82" podStartSLOduration=3.397708793 podStartE2EDuration="3.397708793s" podCreationTimestamp="2025-10-01 15:14:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:27.388189719 +0000 UTC m=+1112.007364890" watchObservedRunningTime="2025-10-01 15:14:27.397708793 +0000 UTC m=+1112.016883964" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.467548 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-combined-ca-bundle\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.467701 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-config-data-custom\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.467778 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-internal-tls-certs\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.468934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-config-data\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.469081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7137719-9397-4b5e-97ae-10176a7deea3-logs\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.469139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-public-tls-certs\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.469165 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntgcf\" (UniqueName: \"kubernetes.io/projected/d7137719-9397-4b5e-97ae-10176a7deea3-kube-api-access-ntgcf\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.570348 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d7137719-9397-4b5e-97ae-10176a7deea3-logs\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.570399 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-public-tls-certs\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.570425 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntgcf\" (UniqueName: \"kubernetes.io/projected/d7137719-9397-4b5e-97ae-10176a7deea3-kube-api-access-ntgcf\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.570464 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-combined-ca-bundle\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.570491 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-config-data-custom\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.570518 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-internal-tls-certs\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.570545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-config-data\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.571651 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d7137719-9397-4b5e-97ae-10176a7deea3-logs\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.575693 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-config-data\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.575935 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-config-data-custom\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.577117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-public-tls-certs\") pod 
\"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.580417 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-combined-ca-bundle\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.585507 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d7137719-9397-4b5e-97ae-10176a7deea3-internal-tls-certs\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.592488 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntgcf\" (UniqueName: \"kubernetes.io/projected/d7137719-9397-4b5e-97ae-10176a7deea3-kube-api-access-ntgcf\") pod \"barbican-api-59879d9576-fvcgl\" (UID: \"d7137719-9397-4b5e-97ae-10176a7deea3\") " pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.839289 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.845286 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-99rss" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.875589 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-db-sync-config-data\") pod \"9a523e37-804a-4173-8012-19848efc8cc0\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.875691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-scripts\") pod \"9a523e37-804a-4173-8012-19848efc8cc0\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.875789 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-combined-ca-bundle\") pod \"9a523e37-804a-4173-8012-19848efc8cc0\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.875850 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75gmn\" (UniqueName: \"kubernetes.io/projected/9a523e37-804a-4173-8012-19848efc8cc0-kube-api-access-75gmn\") pod \"9a523e37-804a-4173-8012-19848efc8cc0\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.875896 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-config-data\") pod \"9a523e37-804a-4173-8012-19848efc8cc0\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.875945 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/9a523e37-804a-4173-8012-19848efc8cc0-etc-machine-id\") pod \"9a523e37-804a-4173-8012-19848efc8cc0\" (UID: \"9a523e37-804a-4173-8012-19848efc8cc0\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.877611 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a523e37-804a-4173-8012-19848efc8cc0-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "9a523e37-804a-4173-8012-19848efc8cc0" (UID: "9a523e37-804a-4173-8012-19848efc8cc0"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.883230 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-scripts" (OuterVolumeSpecName: "scripts") pod "9a523e37-804a-4173-8012-19848efc8cc0" (UID: "9a523e37-804a-4173-8012-19848efc8cc0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.883403 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9a523e37-804a-4173-8012-19848efc8cc0" (UID: "9a523e37-804a-4173-8012-19848efc8cc0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.897647 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.901468 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a523e37-804a-4173-8012-19848efc8cc0-kube-api-access-75gmn" (OuterVolumeSpecName: "kube-api-access-75gmn") pod "9a523e37-804a-4173-8012-19848efc8cc0" (UID: "9a523e37-804a-4173-8012-19848efc8cc0"). InnerVolumeSpecName "kube-api-access-75gmn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977267 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl8nd\" (UniqueName: \"kubernetes.io/projected/b474b0f5-0050-4edd-afac-9237aa7284a5-kube-api-access-fl8nd\") pod \"b474b0f5-0050-4edd-afac-9237aa7284a5\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977298 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-scripts\") pod \"b474b0f5-0050-4edd-afac-9237aa7284a5\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977320 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b474b0f5-0050-4edd-afac-9237aa7284a5-logs\") pod \"b474b0f5-0050-4edd-afac-9237aa7284a5\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977349 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b474b0f5-0050-4edd-afac-9237aa7284a5-horizon-secret-key\") pod \"b474b0f5-0050-4edd-afac-9237aa7284a5\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977415 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-config-data\") pod \"b474b0f5-0050-4edd-afac-9237aa7284a5\" (UID: \"b474b0f5-0050-4edd-afac-9237aa7284a5\") " Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977670 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977680 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75gmn\" (UniqueName: \"kubernetes.io/projected/9a523e37-804a-4173-8012-19848efc8cc0-kube-api-access-75gmn\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977690 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/9a523e37-804a-4173-8012-19848efc8cc0-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.977699 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.978994 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b474b0f5-0050-4edd-afac-9237aa7284a5-logs" (OuterVolumeSpecName: "logs") pod "b474b0f5-0050-4edd-afac-9237aa7284a5" (UID: "b474b0f5-0050-4edd-afac-9237aa7284a5"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.984490 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b474b0f5-0050-4edd-afac-9237aa7284a5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b474b0f5-0050-4edd-afac-9237aa7284a5" (UID: "b474b0f5-0050-4edd-afac-9237aa7284a5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:27 crc kubenswrapper[4771]: I1001 15:14:27.981719 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b474b0f5-0050-4edd-afac-9237aa7284a5-kube-api-access-fl8nd" (OuterVolumeSpecName: "kube-api-access-fl8nd") pod "b474b0f5-0050-4edd-afac-9237aa7284a5" (UID: "b474b0f5-0050-4edd-afac-9237aa7284a5"). InnerVolumeSpecName "kube-api-access-fl8nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.004902 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a523e37-804a-4173-8012-19848efc8cc0" (UID: "9a523e37-804a-4173-8012-19848efc8cc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.008153 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-config-data" (OuterVolumeSpecName: "config-data") pod "9a523e37-804a-4173-8012-19848efc8cc0" (UID: "9a523e37-804a-4173-8012-19848efc8cc0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.018244 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-scripts" (OuterVolumeSpecName: "scripts") pod "b474b0f5-0050-4edd-afac-9237aa7284a5" (UID: "b474b0f5-0050-4edd-afac-9237aa7284a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.036861 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-config-data" (OuterVolumeSpecName: "config-data") pod "b474b0f5-0050-4edd-afac-9237aa7284a5" (UID: "b474b0f5-0050-4edd-afac-9237aa7284a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.083268 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b474b0f5-0050-4edd-afac-9237aa7284a5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.083292 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.083302 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.083311 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a523e37-804a-4173-8012-19848efc8cc0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc 
kubenswrapper[4771]: I1001 15:14:28.083319 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl8nd\" (UniqueName: \"kubernetes.io/projected/b474b0f5-0050-4edd-afac-9237aa7284a5-kube-api-access-fl8nd\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.083329 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b474b0f5-0050-4edd-afac-9237aa7284a5-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.083337 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b474b0f5-0050-4edd-afac-9237aa7284a5-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.147125 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.183998 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.288086 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-scripts\") pod \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.288157 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-logs\") pod \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.288214 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-config-data\") pod \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.288245 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdbvj\" (UniqueName: \"kubernetes.io/projected/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-kube-api-access-pdbvj\") pod \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\" (UID: \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.288397 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe922d16-c5a9-4d8d-ba3e-33042e83372b-horizon-secret-key\") pod \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.288723 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-logs" (OuterVolumeSpecName: "logs") pod "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" (UID: "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.288973 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-scripts\") pod \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.289005 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-config-data\") pod \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.289030 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe922d16-c5a9-4d8d-ba3e-33042e83372b-logs\") pod \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.289094 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhghl\" (UniqueName: \"kubernetes.io/projected/fe922d16-c5a9-4d8d-ba3e-33042e83372b-kube-api-access-qhghl\") pod \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\" (UID: \"fe922d16-c5a9-4d8d-ba3e-33042e83372b\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.289132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-horizon-secret-key\") pod \"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\" (UID: 
\"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0\") " Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.289526 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.290201 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe922d16-c5a9-4d8d-ba3e-33042e83372b-logs" (OuterVolumeSpecName: "logs") pod "fe922d16-c5a9-4d8d-ba3e-33042e83372b" (UID: "fe922d16-c5a9-4d8d-ba3e-33042e83372b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.292340 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-kube-api-access-pdbvj" (OuterVolumeSpecName: "kube-api-access-pdbvj") pod "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" (UID: "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0"). InnerVolumeSpecName "kube-api-access-pdbvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.292361 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe922d16-c5a9-4d8d-ba3e-33042e83372b-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "fe922d16-c5a9-4d8d-ba3e-33042e83372b" (UID: "fe922d16-c5a9-4d8d-ba3e-33042e83372b"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.293374 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe922d16-c5a9-4d8d-ba3e-33042e83372b-kube-api-access-qhghl" (OuterVolumeSpecName: "kube-api-access-qhghl") pod "fe922d16-c5a9-4d8d-ba3e-33042e83372b" (UID: "fe922d16-c5a9-4d8d-ba3e-33042e83372b"). 
InnerVolumeSpecName "kube-api-access-qhghl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.294377 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" (UID: "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.315678 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-scripts" (OuterVolumeSpecName: "scripts") pod "fe922d16-c5a9-4d8d-ba3e-33042e83372b" (UID: "fe922d16-c5a9-4d8d-ba3e-33042e83372b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.317763 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-config-data" (OuterVolumeSpecName: "config-data") pod "fe922d16-c5a9-4d8d-ba3e-33042e83372b" (UID: "fe922d16-c5a9-4d8d-ba3e-33042e83372b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.329023 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-scripts" (OuterVolumeSpecName: "scripts") pod "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" (UID: "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.331307 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-config-data" (OuterVolumeSpecName: "config-data") pod "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" (UID: "e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.357799 4771 generic.go:334] "Generic (PLEG): container finished" podID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerID="b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0" exitCode=137 Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.357837 4771 generic.go:334] "Generic (PLEG): container finished" podID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerID="916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41" exitCode=137 Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.357900 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975b9b95f-8lwcw" event={"ID":"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0","Type":"ContainerDied","Data":"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.357930 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975b9b95f-8lwcw" event={"ID":"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0","Type":"ContainerDied","Data":"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.357942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7975b9b95f-8lwcw" event={"ID":"e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0","Type":"ContainerDied","Data":"f28e2c258c09bbe46fa44aceba7b17ea14f490c94b0e647ad4001180b0a14c1e"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.357969 4771 
scope.go:117] "RemoveContainer" containerID="b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.358149 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7975b9b95f-8lwcw" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.372708 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74dd9b479f-cpgmx" event={"ID":"962b1815-b3dd-47fc-afdf-97a82cc67893","Type":"ContainerStarted","Data":"31010851545f08fb8a6b9634117d59454c5bda6b730f2df27533c05e1f9fa59e"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.377598 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-74dd9b479f-cpgmx" event={"ID":"962b1815-b3dd-47fc-afdf-97a82cc67893","Type":"ContainerStarted","Data":"b47474c5a52a295b26cfefc920f7513cbc8b9bf93ba653d5e8c2dce24fb916f2"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394535 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fe922d16-c5a9-4d8d-ba3e-33042e83372b-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394909 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhghl\" (UniqueName: \"kubernetes.io/projected/fe922d16-c5a9-4d8d-ba3e-33042e83372b-kube-api-access-qhghl\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394923 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394932 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 
15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394945 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394954 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pdbvj\" (UniqueName: \"kubernetes.io/projected/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0-kube-api-access-pdbvj\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394973 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fe922d16-c5a9-4d8d-ba3e-33042e83372b-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394982 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.394993 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fe922d16-c5a9-4d8d-ba3e-33042e83372b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.401567 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" event={"ID":"90aaa270-c5a1-47b4-8adc-2bd096da3ab0","Type":"ContainerStarted","Data":"4511f65cdc958730243b4f5450d38a9e9243b47fdc7a4d2cf387528a7866b41c"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.401615 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" event={"ID":"90aaa270-c5a1-47b4-8adc-2bd096da3ab0","Type":"ContainerStarted","Data":"c121102170b7082bd781a43d65057f0df5bf56b34ad80188c3e377851344174e"} Oct 01 
15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.410618 4771 generic.go:334] "Generic (PLEG): container finished" podID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerID="5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529" exitCode=137 Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.410650 4771 generic.go:334] "Generic (PLEG): container finished" podID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerID="04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5" exitCode=137 Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.410700 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d8c64bd85-clrr4" event={"ID":"fe922d16-c5a9-4d8d-ba3e-33042e83372b","Type":"ContainerDied","Data":"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.410753 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d8c64bd85-clrr4" event={"ID":"fe922d16-c5a9-4d8d-ba3e-33042e83372b","Type":"ContainerDied","Data":"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.410912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-d8c64bd85-clrr4" event={"ID":"fe922d16-c5a9-4d8d-ba3e-33042e83372b","Type":"ContainerDied","Data":"bfbd67eec3cfad1f839156a32e32f0619a78278bc33157788236f20470e12e1a"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.411002 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-d8c64bd85-clrr4" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.428457 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-59879d9576-fvcgl"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.430987 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerStarted","Data":"9d16671ac2a88a6acf48095805e3d17e3f505860fd999bf1e626b6300d8239ba"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.449789 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-74dd9b479f-cpgmx" podStartSLOduration=2.516826069 podStartE2EDuration="4.449770211s" podCreationTimestamp="2025-10-01 15:14:24 +0000 UTC" firstStartedPulling="2025-10-01 15:14:25.322258495 +0000 UTC m=+1109.941433666" lastFinishedPulling="2025-10-01 15:14:27.255202637 +0000 UTC m=+1111.874377808" observedRunningTime="2025-10-01 15:14:28.406044854 +0000 UTC m=+1113.025220035" watchObservedRunningTime="2025-10-01 15:14:28.449770211 +0000 UTC m=+1113.068945382" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.459250 4771 generic.go:334] "Generic (PLEG): container finished" podID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerID="1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31" exitCode=137 Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.459286 4771 generic.go:334] "Generic (PLEG): container finished" podID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerID="2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712" exitCode=137 Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.459359 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ccd9d5bb7-8mdjn" event={"ID":"b474b0f5-0050-4edd-afac-9237aa7284a5","Type":"ContainerDied","Data":"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31"} Oct 01 15:14:28 
crc kubenswrapper[4771]: I1001 15:14:28.459385 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ccd9d5bb7-8mdjn" event={"ID":"b474b0f5-0050-4edd-afac-9237aa7284a5","Type":"ContainerDied","Data":"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.459395 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7ccd9d5bb7-8mdjn" event={"ID":"b474b0f5-0050-4edd-afac-9237aa7284a5","Type":"ContainerDied","Data":"4fb064a0edef6b0399c5f9110bfc424ad88a901268cefcd1373c1ef812e65baf"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.459479 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7ccd9d5bb7-8mdjn" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.486687 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-99rss" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.487340 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-99rss" event={"ID":"9a523e37-804a-4173-8012-19848efc8cc0","Type":"ContainerDied","Data":"0daed170c84f6be65c4e73f4b070a7cac27dccbf117041fd0b321f1dc175d1b6"} Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.487368 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0daed170c84f6be65c4e73f4b070a7cac27dccbf117041fd0b321f1dc175d1b6" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.488395 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-755c69f65b-sb4nj" podStartSLOduration=2.64530847 podStartE2EDuration="4.48837284s" podCreationTimestamp="2025-10-01 15:14:24 +0000 UTC" firstStartedPulling="2025-10-01 15:14:25.413031719 +0000 UTC m=+1110.032206890" lastFinishedPulling="2025-10-01 15:14:27.256096089 +0000 UTC m=+1111.875271260" observedRunningTime="2025-10-01 
15:14:28.444221314 +0000 UTC m=+1113.063396485" watchObservedRunningTime="2025-10-01 15:14:28.48837284 +0000 UTC m=+1113.107548021" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.506560 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7975b9b95f-8lwcw"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.519941 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7975b9b95f-8lwcw"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.528616 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.529569 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.529597 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.529609 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.529615 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.529633 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.529639 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.529652 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a523e37-804a-4173-8012-19848efc8cc0" containerName="cinder-db-sync" Oct 01 15:14:28 crc 
kubenswrapper[4771]: I1001 15:14:28.529657 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a523e37-804a-4173-8012-19848efc8cc0" containerName="cinder-db-sync" Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.529672 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.529677 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.529685 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.529690 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.529700 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.529705 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.532434 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.532463 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.532478 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.532491 4771 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.532499 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" containerName="horizon" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.532513 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a523e37-804a-4173-8012-19848efc8cc0" containerName="cinder-db-sync" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.532530 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" containerName="horizon-log" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.533724 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.536906 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.538040 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-8dwpm" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.538294 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.538297 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.538422 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.550953 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-7vn6t"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.592296 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/horizon-d8c64bd85-clrr4"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.598317 4771 scope.go:117] "RemoveContainer" containerID="916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.614636 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-d8c64bd85-clrr4"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.652441 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-95d56546f-jnd8s"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.654318 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.681786 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-jnd8s"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.703616 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.703855 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.703936 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x298k\" (UniqueName: \"kubernetes.io/projected/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-kube-api-access-x298k\") pod \"cinder-scheduler-0\" (UID: 
\"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.704029 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.704113 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.704222 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.704426 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7ccd9d5bb7-8mdjn"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.712902 4771 scope.go:117] "RemoveContainer" containerID="b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0" Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.714032 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0\": container with ID starting with b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0 not found: ID does not exist" 
containerID="b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.714055 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0"} err="failed to get container status \"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0\": rpc error: code = NotFound desc = could not find container \"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0\": container with ID starting with b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0 not found: ID does not exist" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.714073 4771 scope.go:117] "RemoveContainer" containerID="916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41" Oct 01 15:14:28 crc kubenswrapper[4771]: E1001 15:14:28.716391 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41\": container with ID starting with 916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41 not found: ID does not exist" containerID="916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.716418 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41"} err="failed to get container status \"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41\": rpc error: code = NotFound desc = could not find container \"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41\": container with ID starting with 916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41 not found: ID does not exist" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.716431 4771 scope.go:117] 
"RemoveContainer" containerID="b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.716742 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0"} err="failed to get container status \"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0\": rpc error: code = NotFound desc = could not find container \"b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0\": container with ID starting with b91bbfae12a5fc93cf8ef735589297d3e089c9dc513874859fff4236fc6d7fc0 not found: ID does not exist" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.716819 4771 scope.go:117] "RemoveContainer" containerID="916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.717244 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7ccd9d5bb7-8mdjn"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.719397 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41"} err="failed to get container status \"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41\": rpc error: code = NotFound desc = could not find container \"916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41\": container with ID starting with 916620a2144c8c9eebb35a2e594232a97b679f6372443ca55296a2a41bdc9e41 not found: ID does not exist" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.719417 4771 scope.go:117] "RemoveContainer" containerID="5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.731085 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 
15:14:28.742466 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.742792 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.748632 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.805584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.805779 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x298k\" (UniqueName: \"kubernetes.io/projected/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-kube-api-access-x298k\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.806089 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.806610 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-swift-storage-0\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: 
I1001 15:14:28.806713 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.807132 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-config\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.807254 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-sb\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.807354 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-nb\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.807502 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.807572 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26nzn\" (UniqueName: \"kubernetes.io/projected/2f1ed27a-5b60-4efe-8dde-58545050953f-kube-api-access-26nzn\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.807684 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-svc\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.807792 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.805741 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.809828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-scripts\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.811035 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.812876 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.816081 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.820761 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x298k\" (UniqueName: \"kubernetes.io/projected/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-kube-api-access-x298k\") pod \"cinder-scheduler-0\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.899042 4771 scope.go:117] "RemoveContainer" containerID="04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.899805 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.908845 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-svc\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.908919 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.908948 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdjpb\" (UniqueName: \"kubernetes.io/projected/d5b6a127-5e69-4245-bb95-b5aa136a80e5-kube-api-access-jdjpb\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.908992 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-swift-storage-0\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.909017 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-config\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc 
kubenswrapper[4771]: I1001 15:14:28.909039 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-sb\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.909058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-nb\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.909084 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5b6a127-5e69-4245-bb95-b5aa136a80e5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.909103 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26nzn\" (UniqueName: \"kubernetes.io/projected/2f1ed27a-5b60-4efe-8dde-58545050953f-kube-api-access-26nzn\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.909126 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-scripts\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.909147 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b6a127-5e69-4245-bb95-b5aa136a80e5-logs\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.909163 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.909210 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.910009 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-svc\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.910532 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-swift-storage-0\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.911078 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-config\") 
pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.911752 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-nb\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.912062 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-sb\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.928344 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26nzn\" (UniqueName: \"kubernetes.io/projected/2f1ed27a-5b60-4efe-8dde-58545050953f-kube-api-access-26nzn\") pod \"dnsmasq-dns-95d56546f-jnd8s\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:28 crc kubenswrapper[4771]: I1001 15:14:28.997911 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011165 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b6a127-5e69-4245-bb95-b5aa136a80e5-logs\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011510 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011577 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011606 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdjpb\" (UniqueName: \"kubernetes.io/projected/d5b6a127-5e69-4245-bb95-b5aa136a80e5-kube-api-access-jdjpb\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011614 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/d5b6a127-5e69-4245-bb95-b5aa136a80e5-logs\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011773 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5b6a127-5e69-4245-bb95-b5aa136a80e5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011823 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-scripts\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.011942 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5b6a127-5e69-4245-bb95-b5aa136a80e5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.014829 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-scripts\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.017044 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.017489 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.017564 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.027819 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdjpb\" (UniqueName: \"kubernetes.io/projected/d5b6a127-5e69-4245-bb95-b5aa136a80e5-kube-api-access-jdjpb\") pod \"cinder-api-0\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.072180 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.205989 4771 scope.go:117] "RemoveContainer" containerID="5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529" Oct 01 15:14:29 crc kubenswrapper[4771]: E1001 15:14:29.214117 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529\": container with ID starting with 5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529 not found: ID does not exist" containerID="5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.214161 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529"} err="failed to get container status \"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529\": rpc error: code = NotFound desc = could not find container \"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529\": container with ID starting with 5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529 not found: ID does not exist" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.214193 4771 scope.go:117] "RemoveContainer" containerID="04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5" Oct 01 15:14:29 crc kubenswrapper[4771]: E1001 15:14:29.215364 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5\": container with ID starting with 04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5 not found: ID does not exist" containerID="04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.215391 
4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5"} err="failed to get container status \"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5\": rpc error: code = NotFound desc = could not find container \"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5\": container with ID starting with 04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5 not found: ID does not exist" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.215406 4771 scope.go:117] "RemoveContainer" containerID="5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.215999 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529"} err="failed to get container status \"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529\": rpc error: code = NotFound desc = could not find container \"5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529\": container with ID starting with 5b2dbe6ff477b1c6c847a723de09cfcb54f04a483fd30ff4dea68ed0c3baa529 not found: ID does not exist" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.216027 4771 scope.go:117] "RemoveContainer" containerID="04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.218413 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5"} err="failed to get container status \"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5\": rpc error: code = NotFound desc = could not find container \"04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5\": container with ID starting with 
04f3e4001efd65b4a9a1aec7134fb2075a5a1d446ae02e867830c733db9cc0a5 not found: ID does not exist" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.218436 4771 scope.go:117] "RemoveContainer" containerID="1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.418272 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.476719 4771 scope.go:117] "RemoveContainer" containerID="2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712" Oct 01 15:14:29 crc kubenswrapper[4771]: W1001 15:14:29.483255 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f26bf4c_7205_40ae_b698_e4d8a796f8f6.slice/crio-7d502be1a04c13a4e07e8301f4135216653172113967137e3723df28543f0ad6 WatchSource:0}: Error finding container 7d502be1a04c13a4e07e8301f4135216653172113967137e3723df28543f0ad6: Status 404 returned error can't find the container with id 7d502be1a04c13a4e07e8301f4135216653172113967137e3723df28543f0ad6 Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.494195 4771 scope.go:117] "RemoveContainer" containerID="1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31" Oct 01 15:14:29 crc kubenswrapper[4771]: E1001 15:14:29.494834 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31\": container with ID starting with 1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31 not found: ID does not exist" containerID="1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.494870 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31"} err="failed to get container status \"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31\": rpc error: code = NotFound desc = could not find container \"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31\": container with ID starting with 1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31 not found: ID does not exist" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.494895 4771 scope.go:117] "RemoveContainer" containerID="2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712" Oct 01 15:14:29 crc kubenswrapper[4771]: E1001 15:14:29.495165 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712\": container with ID starting with 2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712 not found: ID does not exist" containerID="2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.495183 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712"} err="failed to get container status \"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712\": rpc error: code = NotFound desc = could not find container \"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712\": container with ID starting with 2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712 not found: ID does not exist" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.495210 4771 scope.go:117] "RemoveContainer" containerID="1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.495478 4771 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31"} err="failed to get container status \"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31\": rpc error: code = NotFound desc = could not find container \"1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31\": container with ID starting with 1bd15e2f9655dbcc18b1e30a9a51e6d0824a7c6accf54c984e89caec4321ea31 not found: ID does not exist" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.495500 4771 scope.go:117] "RemoveContainer" containerID="2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.495703 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712"} err="failed to get container status \"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712\": rpc error: code = NotFound desc = could not find container \"2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712\": container with ID starting with 2809c1f07caf07bac6fafb7905817befa47759d615f86ffdddc85b049955a712 not found: ID does not exist" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.498389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerStarted","Data":"0e3fec9e2d3321977f601130aa2e001e5c14828d6eaf2a799904298413c30b9d"} Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.500847 4771 generic.go:334] "Generic (PLEG): container finished" podID="7b7689a2-6ac8-47ac-86f7-7456994c39ca" containerID="c00acf985cfe64678616d36ec81d0a930db7d80fdf370d7eb6a3a5fa5167bcb0" exitCode=0 Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.500906 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gq7gl" 
event={"ID":"7b7689a2-6ac8-47ac-86f7-7456994c39ca","Type":"ContainerDied","Data":"c00acf985cfe64678616d36ec81d0a930db7d80fdf370d7eb6a3a5fa5167bcb0"} Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.505190 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59879d9576-fvcgl" event={"ID":"d7137719-9397-4b5e-97ae-10176a7deea3","Type":"ContainerStarted","Data":"8953d79122f5d93b640fa0301e418854903923e6d822835360b6c90ed71014bd"} Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.505237 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59879d9576-fvcgl" event={"ID":"d7137719-9397-4b5e-97ae-10176a7deea3","Type":"ContainerStarted","Data":"8e7f90c64c0cde9ac68456a795806745e16d522fa40d26488aaf34206dd36101"} Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.505246 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-59879d9576-fvcgl" event={"ID":"d7137719-9397-4b5e-97ae-10176a7deea3","Type":"ContainerStarted","Data":"b9d4dfded0b86a28318a0d24687f3f248a7b0ffa10faa6e9ddd9cf4b0f13c8cd"} Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.505391 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.506053 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.508774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f26bf4c-7205-40ae-b698-e4d8a796f8f6","Type":"ContainerStarted","Data":"7d502be1a04c13a4e07e8301f4135216653172113967137e3723df28543f0ad6"} Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.541019 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-59879d9576-fvcgl" podStartSLOduration=2.541000871 
podStartE2EDuration="2.541000871s" podCreationTimestamp="2025-10-01 15:14:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:29.538931701 +0000 UTC m=+1114.158106892" watchObservedRunningTime="2025-10-01 15:14:29.541000871 +0000 UTC m=+1114.160176042" Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.617526 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 15:14:29 crc kubenswrapper[4771]: I1001 15:14:29.627422 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-jnd8s"] Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.002441 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b474b0f5-0050-4edd-afac-9237aa7284a5" path="/var/lib/kubelet/pods/b474b0f5-0050-4edd-afac-9237aa7284a5/volumes" Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.003879 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0" path="/var/lib/kubelet/pods/e17d9e2b-57c8-4ce4-b4b6-c479ec9128a0/volumes" Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.005274 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe922d16-c5a9-4d8d-ba3e-33042e83372b" path="/var/lib/kubelet/pods/fe922d16-c5a9-4d8d-ba3e-33042e83372b/volumes" Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.519987 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f1ed27a-5b60-4efe-8dde-58545050953f" containerID="1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d" exitCode=0 Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.520031 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" event={"ID":"2f1ed27a-5b60-4efe-8dde-58545050953f","Type":"ContainerDied","Data":"1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d"} Oct 01 
15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.520311 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" event={"ID":"2f1ed27a-5b60-4efe-8dde-58545050953f","Type":"ContainerStarted","Data":"6521fb8737cb82bd088e7ba8606ecd9b1e73688c617ed0d42dd02605e03727c7"} Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.530937 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d5b6a127-5e69-4245-bb95-b5aa136a80e5","Type":"ContainerStarted","Data":"28e917088bcd60e0fdaf0aa6d0eff59e4c73b02d1d4bd8015e414f95abe783ce"} Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.530994 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d5b6a127-5e69-4245-bb95-b5aa136a80e5","Type":"ContainerStarted","Data":"d1bc830d5dca8cfc2f5d88569309135ee1c7b0416f31a94d3a65fe07bc60e9fe"} Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.531211 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" podUID="8eccb2db-b788-4639-a452-b2d7738c5126" containerName="dnsmasq-dns" containerID="cri-o://a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056" gracePeriod=10 Oct 01 15:14:30 crc kubenswrapper[4771]: I1001 15:14:30.665595 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.368200 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.382087 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gq7gl" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474565 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-svc\") pod \"8eccb2db-b788-4639-a452-b2d7738c5126\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474681 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-combined-ca-bundle\") pod \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474711 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-sb\") pod \"8eccb2db-b788-4639-a452-b2d7738c5126\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474746 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctt42\" (UniqueName: \"kubernetes.io/projected/8eccb2db-b788-4639-a452-b2d7738c5126-kube-api-access-ctt42\") pod \"8eccb2db-b788-4639-a452-b2d7738c5126\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474771 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-nb\") pod \"8eccb2db-b788-4639-a452-b2d7738c5126\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474819 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-swift-storage-0\") pod \"8eccb2db-b788-4639-a452-b2d7738c5126\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474848 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-config-data\") pod \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474868 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-db-sync-config-data\") pod \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474932 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ncw8\" (UniqueName: \"kubernetes.io/projected/7b7689a2-6ac8-47ac-86f7-7456994c39ca-kube-api-access-6ncw8\") pod \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\" (UID: \"7b7689a2-6ac8-47ac-86f7-7456994c39ca\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.474956 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-config\") pod \"8eccb2db-b788-4639-a452-b2d7738c5126\" (UID: \"8eccb2db-b788-4639-a452-b2d7738c5126\") " Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.486030 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "7b7689a2-6ac8-47ac-86f7-7456994c39ca" (UID: "7b7689a2-6ac8-47ac-86f7-7456994c39ca"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.516004 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b7689a2-6ac8-47ac-86f7-7456994c39ca-kube-api-access-6ncw8" (OuterVolumeSpecName: "kube-api-access-6ncw8") pod "7b7689a2-6ac8-47ac-86f7-7456994c39ca" (UID: "7b7689a2-6ac8-47ac-86f7-7456994c39ca"). InnerVolumeSpecName "kube-api-access-6ncw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.519205 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eccb2db-b788-4639-a452-b2d7738c5126-kube-api-access-ctt42" (OuterVolumeSpecName: "kube-api-access-ctt42") pod "8eccb2db-b788-4639-a452-b2d7738c5126" (UID: "8eccb2db-b788-4639-a452-b2d7738c5126"). InnerVolumeSpecName "kube-api-access-ctt42". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.558854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerStarted","Data":"711c0d9cbcec5dcb83a304ad264baccaf01817bedf2c05babbc4989523752a60"} Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.560052 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.564211 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gq7gl" event={"ID":"7b7689a2-6ac8-47ac-86f7-7456994c39ca","Type":"ContainerDied","Data":"21ea8d6fe0775b907e8f132fc2b8712483c01cde09959015fbdec9235c0eb530"} Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.564242 4771 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="21ea8d6fe0775b907e8f132fc2b8712483c01cde09959015fbdec9235c0eb530" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.564286 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gq7gl" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.567380 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d5b6a127-5e69-4245-bb95-b5aa136a80e5","Type":"ContainerStarted","Data":"84abb579fa2accc1e6928c8cfcaa03cacfd3aa073a677b804838ee82506ea95c"} Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.567504 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerName="cinder-api-log" containerID="cri-o://28e917088bcd60e0fdaf0aa6d0eff59e4c73b02d1d4bd8015e414f95abe783ce" gracePeriod=30 Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.567748 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.567783 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerName="cinder-api" containerID="cri-o://84abb579fa2accc1e6928c8cfcaa03cacfd3aa073a677b804838ee82506ea95c" gracePeriod=30 Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.571339 4771 generic.go:334] "Generic (PLEG): container finished" podID="8eccb2db-b788-4639-a452-b2d7738c5126" containerID="a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056" exitCode=0 Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.571389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" event={"ID":"8eccb2db-b788-4639-a452-b2d7738c5126","Type":"ContainerDied","Data":"a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056"} Oct 01 15:14:31 crc 
kubenswrapper[4771]: I1001 15:14:31.571439 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" event={"ID":"8eccb2db-b788-4639-a452-b2d7738c5126","Type":"ContainerDied","Data":"1bc4d7f1aea4ba89007ec35fa5153e8ce0c579983ea02236323e656b940b7252"} Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.571457 4771 scope.go:117] "RemoveContainer" containerID="a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.571568 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-789c5c5cb7-7vn6t" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.576965 4771 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.576995 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ncw8\" (UniqueName: \"kubernetes.io/projected/7b7689a2-6ac8-47ac-86f7-7456994c39ca-kube-api-access-6ncw8\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.577004 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctt42\" (UniqueName: \"kubernetes.io/projected/8eccb2db-b788-4639-a452-b2d7738c5126-kube-api-access-ctt42\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.580208 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" event={"ID":"2f1ed27a-5b60-4efe-8dde-58545050953f","Type":"ContainerStarted","Data":"43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d"} Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.588445 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" 
podStartSLOduration=1.730174122 podStartE2EDuration="7.5884203s" podCreationTimestamp="2025-10-01 15:14:24 +0000 UTC" firstStartedPulling="2025-10-01 15:14:25.196285235 +0000 UTC m=+1109.815460406" lastFinishedPulling="2025-10-01 15:14:31.054531413 +0000 UTC m=+1115.673706584" observedRunningTime="2025-10-01 15:14:31.58111733 +0000 UTC m=+1116.200292491" watchObservedRunningTime="2025-10-01 15:14:31.5884203 +0000 UTC m=+1116.207595471" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.620599 4771 scope.go:117] "RemoveContainer" containerID="1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.624599 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.62458021 podStartE2EDuration="3.62458021s" podCreationTimestamp="2025-10-01 15:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:31.615147877 +0000 UTC m=+1116.234323048" watchObservedRunningTime="2025-10-01 15:14:31.62458021 +0000 UTC m=+1116.243755381" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.651492 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" podStartSLOduration=3.651474361 podStartE2EDuration="3.651474361s" podCreationTimestamp="2025-10-01 15:14:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:31.641508496 +0000 UTC m=+1116.260683667" watchObservedRunningTime="2025-10-01 15:14:31.651474361 +0000 UTC m=+1116.270649532" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.683475 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "7b7689a2-6ac8-47ac-86f7-7456994c39ca" (UID: "7b7689a2-6ac8-47ac-86f7-7456994c39ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.701107 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-config" (OuterVolumeSpecName: "config") pod "8eccb2db-b788-4639-a452-b2d7738c5126" (UID: "8eccb2db-b788-4639-a452-b2d7738c5126"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.726924 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "8eccb2db-b788-4639-a452-b2d7738c5126" (UID: "8eccb2db-b788-4639-a452-b2d7738c5126"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.727938 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8eccb2db-b788-4639-a452-b2d7738c5126" (UID: "8eccb2db-b788-4639-a452-b2d7738c5126"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.761977 4771 scope.go:117] "RemoveContainer" containerID="a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056" Oct 01 15:14:31 crc kubenswrapper[4771]: E1001 15:14:31.764824 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056\": container with ID starting with a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056 not found: ID does not exist" containerID="a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.764869 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056"} err="failed to get container status \"a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056\": rpc error: code = NotFound desc = could not find container \"a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056\": container with ID starting with a050d637d6eff43e669df13511e7d45795ab9b83a29ee8970652d33555303056 not found: ID does not exist" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.764889 4771 scope.go:117] "RemoveContainer" containerID="1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2" Oct 01 15:14:31 crc kubenswrapper[4771]: E1001 15:14:31.772178 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2\": container with ID starting with 1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2 not found: ID does not exist" containerID="1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.772221 
4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2"} err="failed to get container status \"1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2\": rpc error: code = NotFound desc = could not find container \"1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2\": container with ID starting with 1d746fb4716f9d94557319f61b83ac84c791f1e5652b88caa35dc9c741d9c5a2 not found: ID does not exist" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.786222 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-config-data" (OuterVolumeSpecName: "config-data") pod "7b7689a2-6ac8-47ac-86f7-7456994c39ca" (UID: "7b7689a2-6ac8-47ac-86f7-7456994c39ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.786768 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8eccb2db-b788-4639-a452-b2d7738c5126" (UID: "8eccb2db-b788-4639-a452-b2d7738c5126"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.787158 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.787180 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.787190 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.787200 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.787211 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7689a2-6ac8-47ac-86f7-7456994c39ca-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.787219 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.796281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8eccb2db-b788-4639-a452-b2d7738c5126" (UID: "8eccb2db-b788-4639-a452-b2d7738c5126"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.889601 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8eccb2db-b788-4639-a452-b2d7738c5126-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.895676 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-jnd8s"] Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.942037 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-6rfg4"] Oct 01 15:14:31 crc kubenswrapper[4771]: E1001 15:14:31.942400 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eccb2db-b788-4639-a452-b2d7738c5126" containerName="dnsmasq-dns" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.942413 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eccb2db-b788-4639-a452-b2d7738c5126" containerName="dnsmasq-dns" Oct 01 15:14:31 crc kubenswrapper[4771]: E1001 15:14:31.942453 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eccb2db-b788-4639-a452-b2d7738c5126" containerName="init" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.942463 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eccb2db-b788-4639-a452-b2d7738c5126" containerName="init" Oct 01 15:14:31 crc kubenswrapper[4771]: E1001 15:14:31.942472 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b7689a2-6ac8-47ac-86f7-7456994c39ca" containerName="glance-db-sync" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.942477 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7689a2-6ac8-47ac-86f7-7456994c39ca" containerName="glance-db-sync" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.942662 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b7689a2-6ac8-47ac-86f7-7456994c39ca" 
containerName="glance-db-sync" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.942681 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eccb2db-b788-4639-a452-b2d7738c5126" containerName="dnsmasq-dns" Oct 01 15:14:31 crc kubenswrapper[4771]: I1001 15:14:31.943634 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.071114 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-6rfg4"] Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.096331 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-config\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.096476 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.096494 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-svc\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.096551 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.096574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.096605 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v42j\" (UniqueName: \"kubernetes.io/projected/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-kube-api-access-6v42j\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.117463 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-7vn6t"] Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.165064 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-789c5c5cb7-7vn6t"] Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.197664 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.197717 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.197758 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v42j\" (UniqueName: \"kubernetes.io/projected/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-kube-api-access-6v42j\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.197799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-config\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.197901 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.197918 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-svc\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.199147 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-svc\") pod 
\"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.199673 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.200044 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.200269 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-config\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.200517 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.218470 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v42j\" (UniqueName: \"kubernetes.io/projected/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-kube-api-access-6v42j\") pod \"dnsmasq-dns-5784cf869f-6rfg4\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " 
pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.301145 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.593804 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f26bf4c-7205-40ae-b698-e4d8a796f8f6","Type":"ContainerStarted","Data":"d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5"} Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.600971 4771 generic.go:334] "Generic (PLEG): container finished" podID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerID="28e917088bcd60e0fdaf0aa6d0eff59e4c73b02d1d4bd8015e414f95abe783ce" exitCode=143 Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.601079 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d5b6a127-5e69-4245-bb95-b5aa136a80e5","Type":"ContainerDied","Data":"28e917088bcd60e0fdaf0aa6d0eff59e4c73b02d1d4bd8015e414f95abe783ce"} Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.606385 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.764171 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.765575 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.767550 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hfxfb" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.772684 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.787882 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.788066 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.871816 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-6rfg4"] Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.917721 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-logs\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.917839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjhx5\" (UniqueName: \"kubernetes.io/projected/c76c23fa-917c-4457-8a9c-4123d13b17a8-kube-api-access-pjhx5\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.917882 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.917901 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.917943 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.918082 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.918243 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:32 crc kubenswrapper[4771]: I1001 15:14:32.990960 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.000599 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.003238 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.019391 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.019436 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.019482 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.019503 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.019543 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.019590 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-logs\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.019630 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjhx5\" (UniqueName: \"kubernetes.io/projected/c76c23fa-917c-4457-8a9c-4123d13b17a8-kube-api-access-pjhx5\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.024805 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-config-data\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.029243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-scripts\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.029683 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" 
(UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.040710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.042105 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-logs\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.047234 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.068658 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjhx5\" (UniqueName: \"kubernetes.io/projected/c76c23fa-917c-4457-8a9c-4123d13b17a8-kube-api-access-pjhx5\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.069688 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.075774 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.081455 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.121652 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.121766 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.121805 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.121860 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc 
kubenswrapper[4771]: I1001 15:14:33.121901 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-logs\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.121938 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.122006 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8m2w\" (UniqueName: \"kubernetes.io/projected/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-kube-api-access-x8m2w\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.224608 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8m2w\" (UniqueName: \"kubernetes.io/projected/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-kube-api-access-x8m2w\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.224771 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc 
kubenswrapper[4771]: I1001 15:14:33.224834 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.224881 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.224934 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.224979 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-logs\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.225017 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.225288 4771 operation_generator.go:580] "MountVolume.MountDevice 
succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.225380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.225662 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-logs\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.235182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.237633 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.238167 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.243067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8m2w\" (UniqueName: \"kubernetes.io/projected/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-kube-api-access-x8m2w\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.257059 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.481060 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.630675 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f26bf4c-7205-40ae-b698-e4d8a796f8f6","Type":"ContainerStarted","Data":"902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681"} Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.632463 4771 generic.go:334] "Generic (PLEG): container finished" podID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" containerID="48592c9a7cb5619d3e4d46aeb65545fcb70d6aa842f0e7ce407ef457489cc869" exitCode=0 Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.632643 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" podUID="2f1ed27a-5b60-4efe-8dde-58545050953f" containerName="dnsmasq-dns" containerID="cri-o://43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d" gracePeriod=10 Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.633202 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" event={"ID":"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a","Type":"ContainerDied","Data":"48592c9a7cb5619d3e4d46aeb65545fcb70d6aa842f0e7ce407ef457489cc869"} Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.633226 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" event={"ID":"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a","Type":"ContainerStarted","Data":"0946af39b7eb661a94b8d0b8c3bd7dbcc335b425a37f465b78d8f447e9af8cf2"} Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.662315 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.615604836 podStartE2EDuration="5.66229125s" podCreationTimestamp="2025-10-01 15:14:28 +0000 UTC" firstStartedPulling="2025-10-01 15:14:29.49419872 +0000 UTC m=+1114.113373891" 
lastFinishedPulling="2025-10-01 15:14:30.540885134 +0000 UTC m=+1115.160060305" observedRunningTime="2025-10-01 15:14:33.656002145 +0000 UTC m=+1118.275177316" watchObservedRunningTime="2025-10-01 15:14:33.66229125 +0000 UTC m=+1118.281466421" Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.725896 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:33 crc kubenswrapper[4771]: I1001 15:14:33.903808 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.007526 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8eccb2db-b788-4639-a452-b2d7738c5126" path="/var/lib/kubelet/pods/8eccb2db-b788-4639-a452-b2d7738c5126/volumes" Oct 01 15:14:34 crc kubenswrapper[4771]: W1001 15:14:34.130961 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43eb254e_a3d2_483c_9f8b_7e9ee2bb3ff3.slice/crio-6b1cdea39039c2a9c934c15a1b45b435721ac98c331b5030859a8c282f4f045b WatchSource:0}: Error finding container 6b1cdea39039c2a9c934c15a1b45b435721ac98c331b5030859a8c282f4f045b: Status 404 returned error can't find the container with id 6b1cdea39039c2a9c934c15a1b45b435721ac98c331b5030859a8c282f4f045b Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.131872 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.199752 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.250534 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-sb\") pod \"2f1ed27a-5b60-4efe-8dde-58545050953f\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.250619 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26nzn\" (UniqueName: \"kubernetes.io/projected/2f1ed27a-5b60-4efe-8dde-58545050953f-kube-api-access-26nzn\") pod \"2f1ed27a-5b60-4efe-8dde-58545050953f\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.250684 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-nb\") pod \"2f1ed27a-5b60-4efe-8dde-58545050953f\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.250718 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-swift-storage-0\") pod \"2f1ed27a-5b60-4efe-8dde-58545050953f\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.250762 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-config\") pod \"2f1ed27a-5b60-4efe-8dde-58545050953f\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.250782 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-svc\") pod \"2f1ed27a-5b60-4efe-8dde-58545050953f\" (UID: \"2f1ed27a-5b60-4efe-8dde-58545050953f\") " Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.277907 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1ed27a-5b60-4efe-8dde-58545050953f-kube-api-access-26nzn" (OuterVolumeSpecName: "kube-api-access-26nzn") pod "2f1ed27a-5b60-4efe-8dde-58545050953f" (UID: "2f1ed27a-5b60-4efe-8dde-58545050953f"). InnerVolumeSpecName "kube-api-access-26nzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.353180 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26nzn\" (UniqueName: \"kubernetes.io/projected/2f1ed27a-5b60-4efe-8dde-58545050953f-kube-api-access-26nzn\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.420495 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2f1ed27a-5b60-4efe-8dde-58545050953f" (UID: "2f1ed27a-5b60-4efe-8dde-58545050953f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.420659 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2f1ed27a-5b60-4efe-8dde-58545050953f" (UID: "2f1ed27a-5b60-4efe-8dde-58545050953f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.421073 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-config" (OuterVolumeSpecName: "config") pod "2f1ed27a-5b60-4efe-8dde-58545050953f" (UID: "2f1ed27a-5b60-4efe-8dde-58545050953f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.437428 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2f1ed27a-5b60-4efe-8dde-58545050953f" (UID: "2f1ed27a-5b60-4efe-8dde-58545050953f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.447940 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2f1ed27a-5b60-4efe-8dde-58545050953f" (UID: "2f1ed27a-5b60-4efe-8dde-58545050953f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.461846 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.461879 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.461889 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.461899 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.461909 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2f1ed27a-5b60-4efe-8dde-58545050953f-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.673016 4771 generic.go:334] "Generic (PLEG): container finished" podID="2f1ed27a-5b60-4efe-8dde-58545050953f" containerID="43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d" exitCode=0 Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.673211 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.673637 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" event={"ID":"2f1ed27a-5b60-4efe-8dde-58545050953f","Type":"ContainerDied","Data":"43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d"} Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.673675 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-95d56546f-jnd8s" event={"ID":"2f1ed27a-5b60-4efe-8dde-58545050953f","Type":"ContainerDied","Data":"6521fb8737cb82bd088e7ba8606ecd9b1e73688c617ed0d42dd02605e03727c7"} Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.673691 4771 scope.go:117] "RemoveContainer" containerID="43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.685684 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c76c23fa-917c-4457-8a9c-4123d13b17a8","Type":"ContainerStarted","Data":"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585"} Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.685747 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c76c23fa-917c-4457-8a9c-4123d13b17a8","Type":"ContainerStarted","Data":"4ebf84f7c50e104789659aae5e4446a86419b47d14aaeb19c473496bab4318cc"} Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.730522 4771 scope.go:117] "RemoveContainer" containerID="1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.731020 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" event={"ID":"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a","Type":"ContainerStarted","Data":"88e46f77c24fe812a9282ebcc93064318f0b3fbbb76eb9a355bcf862301ba3d0"} Oct 
01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.732365 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.733450 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-jnd8s"] Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.735466 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3","Type":"ContainerStarted","Data":"6b1cdea39039c2a9c934c15a1b45b435721ac98c331b5030859a8c282f4f045b"} Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.740314 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-95d56546f-jnd8s"] Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.761067 4771 scope.go:117] "RemoveContainer" containerID="43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.764596 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" podStartSLOduration=3.764576543 podStartE2EDuration="3.764576543s" podCreationTimestamp="2025-10-01 15:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:34.752069705 +0000 UTC m=+1119.371244886" watchObservedRunningTime="2025-10-01 15:14:34.764576543 +0000 UTC m=+1119.383751714" Oct 01 15:14:34 crc kubenswrapper[4771]: E1001 15:14:34.766925 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d\": container with ID starting with 43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d not found: ID does not exist" 
containerID="43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.766988 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d"} err="failed to get container status \"43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d\": rpc error: code = NotFound desc = could not find container \"43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d\": container with ID starting with 43cd7e5965c7a73c4f780c932f3499366cbea99caf86f303b94dea377e87303d not found: ID does not exist" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.767021 4771 scope.go:117] "RemoveContainer" containerID="1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d" Oct 01 15:14:34 crc kubenswrapper[4771]: E1001 15:14:34.767398 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d\": container with ID starting with 1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d not found: ID does not exist" containerID="1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.767429 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d"} err="failed to get container status \"1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d\": rpc error: code = NotFound desc = could not find container \"1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d\": container with ID starting with 1cdf1537ce894b93395d5e90c4994a08697c3732e3956617b4f01854296ab21d not found: ID does not exist" Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.850709 4771 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:34 crc kubenswrapper[4771]: I1001 15:14:34.915147 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:35 crc kubenswrapper[4771]: I1001 15:14:35.746285 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3","Type":"ContainerStarted","Data":"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f"} Oct 01 15:14:35 crc kubenswrapper[4771]: I1001 15:14:35.749115 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c76c23fa-917c-4457-8a9c-4123d13b17a8","Type":"ContainerStarted","Data":"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87"} Oct 01 15:14:35 crc kubenswrapper[4771]: I1001 15:14:35.749240 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerName="glance-log" containerID="cri-o://0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585" gracePeriod=30 Oct 01 15:14:35 crc kubenswrapper[4771]: I1001 15:14:35.749308 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerName="glance-httpd" containerID="cri-o://17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87" gracePeriod=30 Oct 01 15:14:35 crc kubenswrapper[4771]: I1001 15:14:35.776911 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.776891742 podStartE2EDuration="4.776891742s" podCreationTimestamp="2025-10-01 15:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-10-01 15:14:35.769792567 +0000 UTC m=+1120.388967748" watchObservedRunningTime="2025-10-01 15:14:35.776891742 +0000 UTC m=+1120.396066903" Oct 01 15:14:35 crc kubenswrapper[4771]: I1001 15:14:35.999205 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1ed27a-5b60-4efe-8dde-58545050953f" path="/var/lib/kubelet/pods/2f1ed27a-5b60-4efe-8dde-58545050953f/volumes" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.571681 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.688218 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-74b8bdcb7c-xttgq" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.724401 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjhx5\" (UniqueName: \"kubernetes.io/projected/c76c23fa-917c-4457-8a9c-4123d13b17a8-kube-api-access-pjhx5\") pod \"c76c23fa-917c-4457-8a9c-4123d13b17a8\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.724473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-httpd-run\") pod \"c76c23fa-917c-4457-8a9c-4123d13b17a8\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.724549 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"c76c23fa-917c-4457-8a9c-4123d13b17a8\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.724625 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-logs\") pod \"c76c23fa-917c-4457-8a9c-4123d13b17a8\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.724808 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-config-data\") pod \"c76c23fa-917c-4457-8a9c-4123d13b17a8\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.724837 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-combined-ca-bundle\") pod \"c76c23fa-917c-4457-8a9c-4123d13b17a8\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.724913 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-scripts\") pod \"c76c23fa-917c-4457-8a9c-4123d13b17a8\" (UID: \"c76c23fa-917c-4457-8a9c-4123d13b17a8\") " Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.727008 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-logs" (OuterVolumeSpecName: "logs") pod "c76c23fa-917c-4457-8a9c-4123d13b17a8" (UID: "c76c23fa-917c-4457-8a9c-4123d13b17a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.727119 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c76c23fa-917c-4457-8a9c-4123d13b17a8" (UID: "c76c23fa-917c-4457-8a9c-4123d13b17a8"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.731954 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-scripts" (OuterVolumeSpecName: "scripts") pod "c76c23fa-917c-4457-8a9c-4123d13b17a8" (UID: "c76c23fa-917c-4457-8a9c-4123d13b17a8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.742710 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "c76c23fa-917c-4457-8a9c-4123d13b17a8" (UID: "c76c23fa-917c-4457-8a9c-4123d13b17a8"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.756215 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c76c23fa-917c-4457-8a9c-4123d13b17a8-kube-api-access-pjhx5" (OuterVolumeSpecName: "kube-api-access-pjhx5") pod "c76c23fa-917c-4457-8a9c-4123d13b17a8" (UID: "c76c23fa-917c-4457-8a9c-4123d13b17a8"). InnerVolumeSpecName "kube-api-access-pjhx5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.768278 4771 generic.go:334] "Generic (PLEG): container finished" podID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerID="17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87" exitCode=0 Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.768423 4771 generic.go:334] "Generic (PLEG): container finished" podID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerID="0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585" exitCode=143 Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.768529 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c76c23fa-917c-4457-8a9c-4123d13b17a8","Type":"ContainerDied","Data":"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87"} Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.768632 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c76c23fa-917c-4457-8a9c-4123d13b17a8","Type":"ContainerDied","Data":"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585"} Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.768707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c76c23fa-917c-4457-8a9c-4123d13b17a8","Type":"ContainerDied","Data":"4ebf84f7c50e104789659aae5e4446a86419b47d14aaeb19c473496bab4318cc"} Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.768797 4771 scope.go:117] "RemoveContainer" containerID="17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.769039 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.793095 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c76c23fa-917c-4457-8a9c-4123d13b17a8" (UID: "c76c23fa-917c-4457-8a9c-4123d13b17a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.793779 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerName="glance-log" containerID="cri-o://35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f" gracePeriod=30 Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.794216 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3","Type":"ContainerStarted","Data":"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950"} Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.794296 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerName="glance-httpd" containerID="cri-o://8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950" gracePeriod=30 Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.824027 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.824008347 podStartE2EDuration="5.824008347s" podCreationTimestamp="2025-10-01 15:14:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
15:14:36.819162898 +0000 UTC m=+1121.438338059" watchObservedRunningTime="2025-10-01 15:14:36.824008347 +0000 UTC m=+1121.443183518" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.826943 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.826986 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.826997 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.827013 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.827022 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjhx5\" (UniqueName: \"kubernetes.io/projected/c76c23fa-917c-4457-8a9c-4123d13b17a8-kube-api-access-pjhx5\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.827030 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c76c23fa-917c-4457-8a9c-4123d13b17a8-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.834048 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-config-data" (OuterVolumeSpecName: "config-data") pod "c76c23fa-917c-4457-8a9c-4123d13b17a8" (UID: 
"c76c23fa-917c-4457-8a9c-4123d13b17a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.862086 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.866935 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.893766 4771 scope.go:117] "RemoveContainer" containerID="0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.929868 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c76c23fa-917c-4457-8a9c-4123d13b17a8-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.929903 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.969624 4771 scope.go:117] "RemoveContainer" containerID="17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87" Oct 01 15:14:36 crc kubenswrapper[4771]: E1001 15:14:36.975138 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87\": container with ID starting with 17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87 not found: ID does not exist" containerID="17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.975180 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87"} err="failed to get container status \"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87\": rpc error: code = NotFound desc = could not find container \"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87\": container with ID starting with 17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87 not found: ID does not exist" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.975219 4771 scope.go:117] "RemoveContainer" containerID="0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585" Oct 01 15:14:36 crc kubenswrapper[4771]: E1001 15:14:36.976912 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585\": container with ID starting with 0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585 not found: ID does not exist" containerID="0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.976959 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585"} err="failed to get container status \"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585\": rpc error: code = NotFound desc = could not find container \"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585\": container with ID starting with 0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585 not found: ID does not exist" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.976985 4771 scope.go:117] "RemoveContainer" containerID="17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.977637 4771 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87"} err="failed to get container status \"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87\": rpc error: code = NotFound desc = could not find container \"17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87\": container with ID starting with 17ed8c8d3ebbccdbd8bfc51f79f133ef50f63eaaa428d5854ae2b82f9c262e87 not found: ID does not exist" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.977657 4771 scope.go:117] "RemoveContainer" containerID="0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585" Oct 01 15:14:36 crc kubenswrapper[4771]: I1001 15:14:36.978155 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585"} err="failed to get container status \"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585\": rpc error: code = NotFound desc = could not find container \"0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585\": container with ID starting with 0b71a10e730a0960b8087925063b948ccf9ab92ce65efb799288afd29c631585 not found: ID does not exist" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.030536 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.198807 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.202552 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.249281 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:37 crc kubenswrapper[4771]: E1001 15:14:37.251589 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1ed27a-5b60-4efe-8dde-58545050953f" containerName="init" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.251771 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1ed27a-5b60-4efe-8dde-58545050953f" containerName="init" Oct 01 15:14:37 crc kubenswrapper[4771]: E1001 15:14:37.251915 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1ed27a-5b60-4efe-8dde-58545050953f" containerName="dnsmasq-dns" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.252012 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1ed27a-5b60-4efe-8dde-58545050953f" containerName="dnsmasq-dns" Oct 01 15:14:37 crc kubenswrapper[4771]: E1001 15:14:37.253329 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerName="glance-httpd" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.253503 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerName="glance-httpd" Oct 01 15:14:37 crc kubenswrapper[4771]: E1001 15:14:37.253705 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerName="glance-log" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.253842 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerName="glance-log" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.254183 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1ed27a-5b60-4efe-8dde-58545050953f" containerName="dnsmasq-dns" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.254293 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerName="glance-log" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.254417 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" containerName="glance-httpd" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.256173 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.267992 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.268107 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.273188 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.443777 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-config-data\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.443837 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.443883 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 
crc kubenswrapper[4771]: I1001 15:14:37.443953 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.443983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-logs\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.444040 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.444067 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-scripts\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.444105 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmxd\" (UniqueName: \"kubernetes.io/projected/f267338d-cce8-4abe-b836-519fcca98eed-kube-api-access-hkmxd\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " 
pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.545746 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.545838 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.545900 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.545922 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-logs\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.545972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 
15:14:37.546001 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-scripts\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.546024 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmxd\" (UniqueName: \"kubernetes.io/projected/f267338d-cce8-4abe-b836-519fcca98eed-kube-api-access-hkmxd\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.546050 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-config-data\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.546998 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-logs\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.547097 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.552059 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.552686 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.552970 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.553215 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-scripts\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.571196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-config-data\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.571463 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmxd\" (UniqueName: 
\"kubernetes.io/projected/f267338d-cce8-4abe-b836-519fcca98eed-kube-api-access-hkmxd\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.594951 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.649692 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.650788 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.655024 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.655072 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.655234 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-zzkhx" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.659422 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.709613 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.749391 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.749442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.749495 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsqtd\" (UniqueName: \"kubernetes.io/projected/ce86cee0-2e3b-4bee-9663-143d72bacd33-kube-api-access-rsqtd\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.749569 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.805269 4771 generic.go:334] "Generic (PLEG): container finished" podID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerID="8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950" exitCode=0 Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.805305 4771 generic.go:334] "Generic (PLEG): 
container finished" podID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerID="35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f" exitCode=143 Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.805364 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3","Type":"ContainerDied","Data":"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950"} Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.805391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3","Type":"ContainerDied","Data":"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f"} Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.805403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3","Type":"ContainerDied","Data":"6b1cdea39039c2a9c934c15a1b45b435721ac98c331b5030859a8c282f4f045b"} Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.805425 4771 scope.go:117] "RemoveContainer" containerID="8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.805581 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.833983 4771 scope.go:117] "RemoveContainer" containerID="35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.850489 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-httpd-run\") pod \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.850533 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-combined-ca-bundle\") pod \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.850554 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-scripts\") pod \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.850623 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-config-data\") pod \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.850649 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 
15:14:37.850721 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8m2w\" (UniqueName: \"kubernetes.io/projected/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-kube-api-access-x8m2w\") pod \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.850806 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-logs\") pod \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\" (UID: \"43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3\") " Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.851176 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.851255 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.851315 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.851397 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsqtd\" (UniqueName: 
\"kubernetes.io/projected/ce86cee0-2e3b-4bee-9663-143d72bacd33-kube-api-access-rsqtd\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.851945 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" (UID: "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.852977 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.853394 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-logs" (OuterVolumeSpecName: "logs") pod "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" (UID: "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.856898 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-combined-ca-bundle\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.870511 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config-secret\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.870843 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-kube-api-access-x8m2w" (OuterVolumeSpecName: "kube-api-access-x8m2w") pod "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" (UID: "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3"). InnerVolumeSpecName "kube-api-access-x8m2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.871408 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsqtd\" (UniqueName: \"kubernetes.io/projected/ce86cee0-2e3b-4bee-9663-143d72bacd33-kube-api-access-rsqtd\") pod \"openstackclient\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.873351 4771 scope.go:117] "RemoveContainer" containerID="8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.876862 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-scripts" (OuterVolumeSpecName: "scripts") pod "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" (UID: "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:37 crc kubenswrapper[4771]: E1001 15:14:37.888933 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950\": container with ID starting with 8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950 not found: ID does not exist" containerID="8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.888984 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950"} err="failed to get container status \"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950\": rpc error: code = NotFound desc = could not find container \"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950\": container with ID starting with 
8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950 not found: ID does not exist" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.889009 4771 scope.go:117] "RemoveContainer" containerID="35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.889785 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:14:37 crc kubenswrapper[4771]: E1001 15:14:37.892228 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f\": container with ID starting with 35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f not found: ID does not exist" containerID="35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.895026 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f"} err="failed to get container status \"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f\": rpc error: code = NotFound desc = could not find container \"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f\": container with ID starting with 35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f not found: ID does not exist" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.895084 4771 scope.go:117] "RemoveContainer" containerID="8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.903984 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950"} err="failed to get container status 
\"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950\": rpc error: code = NotFound desc = could not find container \"8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950\": container with ID starting with 8803bf34b581a6c65cd757c381064f893d99b3356fdc1710b47433fa2266d950 not found: ID does not exist" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.904222 4771 scope.go:117] "RemoveContainer" containerID="35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.904427 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" (UID: "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.909850 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f"} err="failed to get container status \"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f\": rpc error: code = NotFound desc = could not find container \"35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f\": container with ID starting with 35667ae51083b65f157604ef33843596c45a65acc2090effd0811ec1a8cd6b6f not found: ID does not exist" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.918855 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" (UID: "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.940308 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.943791 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.952421 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.952965 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 01 15:14:37 crc kubenswrapper[4771]: E1001 15:14:37.953452 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerName="glance-log" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.953469 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerName="glance-log" Oct 01 15:14:37 crc kubenswrapper[4771]: E1001 15:14:37.953512 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerName="glance-httpd" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.953518 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerName="glance-httpd" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.953677 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerName="glance-httpd" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.953701 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" containerName="glance-log" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.954304 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.954987 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.955269 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.955986 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.956036 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.956051 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8m2w\" (UniqueName: \"kubernetes.io/projected/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-kube-api-access-x8m2w\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.956065 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.963020 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.970909 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-config-data" 
(OuterVolumeSpecName: "config-data") pod "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" (UID: "43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:37 crc kubenswrapper[4771]: I1001 15:14:37.987642 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.038357 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c76c23fa-917c-4457-8a9c-4123d13b17a8" path="/var/lib/kubelet/pods/c76c23fa-917c-4457-8a9c-4123d13b17a8/volumes" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.057813 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-openstack-config\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.057879 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.057901 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-openstack-config-secret\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.057968 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47wtm\" (UniqueName: \"kubernetes.io/projected/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-kube-api-access-47wtm\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.058032 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.058045 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.165844 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47wtm\" (UniqueName: \"kubernetes.io/projected/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-kube-api-access-47wtm\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.166291 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-openstack-config\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.166356 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-openstack-config-secret\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.166377 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.169400 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-openstack-config\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.178486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-openstack-config-secret\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.184389 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.189420 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47wtm\" (UniqueName: \"kubernetes.io/projected/a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11-kube-api-access-47wtm\") pod \"openstackclient\" (UID: \"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11\") " pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: E1001 15:14:38.190792 4771 log.go:32] "RunPodSandbox from runtime service failed" err=< Oct 01 15:14:38 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_openstackclient_openstack_ce86cee0-2e3b-4bee-9663-143d72bacd33_0(b50a60c4949359d82dc0e9e9e53fb581da210d898f50f2baa8986b979be14f47): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b50a60c4949359d82dc0e9e9e53fb581da210d898f50f2baa8986b979be14f47" Netns:"/var/run/netns/a3fea132-c035-428a-ad42-8a8ff8cef882" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=b50a60c4949359d82dc0e9e9e53fb581da210d898f50f2baa8986b979be14f47;K8S_POD_UID=ce86cee0-2e3b-4bee-9663-143d72bacd33" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ce86cee0-2e3b-4bee-9663-143d72bacd33]: expected pod UID "ce86cee0-2e3b-4bee-9663-143d72bacd33" but got "a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11" from Kube API Oct 01 15:14:38 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 01 15:14:38 crc kubenswrapper[4771]: > Oct 01 15:14:38 crc kubenswrapper[4771]: E1001 15:14:38.190861 4771 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Oct 01 15:14:38 crc kubenswrapper[4771]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_ce86cee0-2e3b-4bee-9663-143d72bacd33_0(b50a60c4949359d82dc0e9e9e53fb581da210d898f50f2baa8986b979be14f47): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request 
failed with status 400: 'ContainerID:"b50a60c4949359d82dc0e9e9e53fb581da210d898f50f2baa8986b979be14f47" Netns:"/var/run/netns/a3fea132-c035-428a-ad42-8a8ff8cef882" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=b50a60c4949359d82dc0e9e9e53fb581da210d898f50f2baa8986b979be14f47;K8S_POD_UID=ce86cee0-2e3b-4bee-9663-143d72bacd33" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/ce86cee0-2e3b-4bee-9663-143d72bacd33]: expected pod UID "ce86cee0-2e3b-4bee-9663-143d72bacd33" but got "a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11" from Kube API Oct 01 15:14:38 crc kubenswrapper[4771]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Oct 01 15:14:38 crc kubenswrapper[4771]: > pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.309481 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.320000 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.331932 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.333666 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.336949 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.337020 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.337122 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.340626 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.476667 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz64z\" (UniqueName: \"kubernetes.io/projected/6e1a8666-1d85-403b-88ac-8ee7417faba9-kube-api-access-fz64z\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.477040 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.477108 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc 
kubenswrapper[4771]: I1001 15:14:38.477135 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.477216 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.477237 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.477278 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.477313 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-logs\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 
01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.578596 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz64z\" (UniqueName: \"kubernetes.io/projected/6e1a8666-1d85-403b-88ac-8ee7417faba9-kube-api-access-fz64z\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.578640 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.578710 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.578758 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.578821 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 
15:14:38.578839 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.578875 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.578907 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-logs\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.579186 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.581069 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.581966 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-logs\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.590471 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.591004 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.596984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.609577 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz64z\" (UniqueName: \"kubernetes.io/projected/6e1a8666-1d85-403b-88ac-8ee7417faba9-kube-api-access-fz64z\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.623095 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.649038 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:14:38 crc kubenswrapper[4771]: W1001 15:14:38.650156 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf267338d_cce8_4abe_b836_519fcca98eed.slice/crio-28c9a7fe73f401cda6b2dbd5c92fa8e3d96e6b393d763fe9ae04f87bc0bb94c8 WatchSource:0}: Error finding container 28c9a7fe73f401cda6b2dbd5c92fa8e3d96e6b393d763fe9ae04f87bc0bb94c8: Status 404 returned error can't find the container with id 28c9a7fe73f401cda6b2dbd5c92fa8e3d96e6b393d763fe9ae04f87bc0bb94c8 Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.673001 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.717229 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.853060 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.853751 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f267338d-cce8-4abe-b836-519fcca98eed","Type":"ContainerStarted","Data":"28c9a7fe73f401cda6b2dbd5c92fa8e3d96e6b393d763fe9ae04f87bc0bb94c8"} Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.892146 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.900133 4771 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ce86cee0-2e3b-4bee-9663-143d72bacd33" podUID="a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.912992 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.988386 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config-secret\") pod \"ce86cee0-2e3b-4bee-9663-143d72bacd33\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.988473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-combined-ca-bundle\") pod \"ce86cee0-2e3b-4bee-9663-143d72bacd33\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.988537 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config\") pod 
\"ce86cee0-2e3b-4bee-9663-143d72bacd33\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.988599 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsqtd\" (UniqueName: \"kubernetes.io/projected/ce86cee0-2e3b-4bee-9663-143d72bacd33-kube-api-access-rsqtd\") pod \"ce86cee0-2e3b-4bee-9663-143d72bacd33\" (UID: \"ce86cee0-2e3b-4bee-9663-143d72bacd33\") " Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.989245 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "ce86cee0-2e3b-4bee-9663-143d72bacd33" (UID: "ce86cee0-2e3b-4bee-9663-143d72bacd33"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.989750 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.993258 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce86cee0-2e3b-4bee-9663-143d72bacd33" (UID: "ce86cee0-2e3b-4bee-9663-143d72bacd33"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.993958 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce86cee0-2e3b-4bee-9663-143d72bacd33-kube-api-access-rsqtd" (OuterVolumeSpecName: "kube-api-access-rsqtd") pod "ce86cee0-2e3b-4bee-9663-143d72bacd33" (UID: "ce86cee0-2e3b-4bee-9663-143d72bacd33"). 
InnerVolumeSpecName "kube-api-access-rsqtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:38 crc kubenswrapper[4771]: I1001 15:14:38.998847 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "ce86cee0-2e3b-4bee-9663-143d72bacd33" (UID: "ce86cee0-2e3b-4bee-9663-143d72bacd33"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.092780 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.092829 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsqtd\" (UniqueName: \"kubernetes.io/projected/ce86cee0-2e3b-4bee-9663-143d72bacd33-kube-api-access-rsqtd\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.092841 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/ce86cee0-2e3b-4bee-9663-143d72bacd33-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.168598 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.206326 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.299403 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.770553 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.867068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11","Type":"ContainerStarted","Data":"211f831c2bde64aa212e940796865d4a94682978655d70782ec995b572fbd637"} Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.868157 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f267338d-cce8-4abe-b836-519fcca98eed","Type":"ContainerStarted","Data":"11a86c2d50fbf4102cdb5f5fbef715c02373931b76a24fd501536adb701cd374"} Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.869152 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.870257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e1a8666-1d85-403b-88ac-8ee7417faba9","Type":"ContainerStarted","Data":"ade83319257f6b31faf018cada3ce391a4094fd0f817184736d785090453fef6"} Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.870495 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerName="cinder-scheduler" containerID="cri-o://d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5" gracePeriod=30 Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.870596 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerName="probe" containerID="cri-o://902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681" gracePeriod=30 Oct 01 15:14:39 crc kubenswrapper[4771]: I1001 15:14:39.890893 4771 status_manager.go:861] "Pod was deleted and then 
recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="ce86cee0-2e3b-4bee-9663-143d72bacd33" podUID="a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11" Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.009948 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3" path="/var/lib/kubelet/pods/43eb254e-a3d2-483c-9f8b-7e9ee2bb3ff3/volumes" Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.010875 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce86cee0-2e3b-4bee-9663-143d72bacd33" path="/var/lib/kubelet/pods/ce86cee0-2e3b-4bee-9663-143d72bacd33/volumes" Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.104170 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-59879d9576-fvcgl" Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.136513 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.184032 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-554565568b-5js82"] Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.184534 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-554565568b-5js82" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerName="barbican-api-log" containerID="cri-o://54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68" gracePeriod=30 Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.184790 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-554565568b-5js82" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerName="barbican-api" containerID="cri-o://68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f" gracePeriod=30 Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.891157 4771 generic.go:334] 
"Generic (PLEG): container finished" podID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerID="54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68" exitCode=143 Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.891253 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554565568b-5js82" event={"ID":"7323fb43-89a2-4be3-96fc-f8633f91fd8c","Type":"ContainerDied","Data":"54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68"} Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.895630 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f267338d-cce8-4abe-b836-519fcca98eed","Type":"ContainerStarted","Data":"8df1a31fa585058e4faf5751ffdbf997240f309b7cb136fb0ddc8a85c0d1b7a0"} Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.902600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e1a8666-1d85-403b-88ac-8ee7417faba9","Type":"ContainerStarted","Data":"11ffecfc6813991424be80103fa3cc6c4c21baf81e0ca539d2bebb199a4b3356"} Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.902649 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e1a8666-1d85-403b-88ac-8ee7417faba9","Type":"ContainerStarted","Data":"686e91eb3fb6a5d0be094dd11ce59a79a83eeddbf9e9c9a3a86df4a6f4c25305"} Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.909306 4771 generic.go:334] "Generic (PLEG): container finished" podID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerID="902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681" exitCode=0 Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.909932 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f26bf4c-7205-40ae-b698-e4d8a796f8f6","Type":"ContainerDied","Data":"902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681"} Oct 01 
15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.914080 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.914063647 podStartE2EDuration="3.914063647s" podCreationTimestamp="2025-10-01 15:14:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:40.911403742 +0000 UTC m=+1125.530578903" watchObservedRunningTime="2025-10-01 15:14:40.914063647 +0000 UTC m=+1125.533238838" Oct 01 15:14:40 crc kubenswrapper[4771]: I1001 15:14:40.945368 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.945351167 podStartE2EDuration="2.945351167s" podCreationTimestamp="2025-10-01 15:14:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:40.941202305 +0000 UTC m=+1125.560377476" watchObservedRunningTime="2025-10-01 15:14:40.945351167 +0000 UTC m=+1125.564526338" Oct 01 15:14:41 crc kubenswrapper[4771]: I1001 15:14:41.302642 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.179431 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.180575 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.276084 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67bb68cc5c-l7gnn" Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.308892 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.349755 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c4877d5c6-thmgz"] Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.350056 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c4877d5c6-thmgz" podUID="f66ea972-6475-4624-a093-8884ead588f8" containerName="neutron-api" containerID="cri-o://23622ff95d6dcef208b1ee5e0aa03a223373fc0f91fb4beeaf4078217eaab3c5" gracePeriod=30 Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.350544 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6c4877d5c6-thmgz" podUID="f66ea972-6475-4624-a093-8884ead588f8" containerName="neutron-httpd" containerID="cri-o://1295acd6636ce8c5c95b86fa2c5b81795c12be6d40eb7105163c265e27ddebc4" gracePeriod=30 Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.383748 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-kvgqz"] Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.384191 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" podUID="b3ad98d7-c092-4ef4-96b7-194255e37e83" containerName="dnsmasq-dns" containerID="cri-o://909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409" gracePeriod=10 Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.937049 4771 generic.go:334] "Generic (PLEG): container finished" podID="f66ea972-6475-4624-a093-8884ead588f8" 
containerID="1295acd6636ce8c5c95b86fa2c5b81795c12be6d40eb7105163c265e27ddebc4" exitCode=0 Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.937408 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4877d5c6-thmgz" event={"ID":"f66ea972-6475-4624-a093-8884ead588f8","Type":"ContainerDied","Data":"1295acd6636ce8c5c95b86fa2c5b81795c12be6d40eb7105163c265e27ddebc4"} Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.939369 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.939872 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3ad98d7-c092-4ef4-96b7-194255e37e83" containerID="909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409" exitCode=0 Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.939904 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" event={"ID":"b3ad98d7-c092-4ef4-96b7-194255e37e83","Type":"ContainerDied","Data":"909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409"} Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.939924 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" event={"ID":"b3ad98d7-c092-4ef4-96b7-194255e37e83","Type":"ContainerDied","Data":"1527d0e9c8b96e4f81c5aada4a981f12a1699c639f2b1fe2ca1041fc86dedce1"} Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.939943 4771 scope.go:117] "RemoveContainer" containerID="909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409" Oct 01 15:14:42 crc kubenswrapper[4771]: I1001 15:14:42.980108 4771 scope.go:117] "RemoveContainer" containerID="154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.007339 4771 scope.go:117] "RemoveContainer" 
containerID="909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409" Oct 01 15:14:43 crc kubenswrapper[4771]: E1001 15:14:43.007889 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409\": container with ID starting with 909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409 not found: ID does not exist" containerID="909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.007929 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409"} err="failed to get container status \"909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409\": rpc error: code = NotFound desc = could not find container \"909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409\": container with ID starting with 909ecd1dd1bb27090a1c5e17a30eaa369edde5b3077b211232eb37e1175a5409 not found: ID does not exist" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.007956 4771 scope.go:117] "RemoveContainer" containerID="154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649" Oct 01 15:14:43 crc kubenswrapper[4771]: E1001 15:14:43.010145 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649\": container with ID starting with 154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649 not found: ID does not exist" containerID="154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.010171 4771 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649"} err="failed to get container status \"154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649\": rpc error: code = NotFound desc = could not find container \"154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649\": container with ID starting with 154182c8baec91e004c6d9fc94ba3acc61f40ecd165a8c1fc0a5681284238649 not found: ID does not exist" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.020419 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-sb\") pod \"b3ad98d7-c092-4ef4-96b7-194255e37e83\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.020467 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-nb\") pod \"b3ad98d7-c092-4ef4-96b7-194255e37e83\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.020522 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-svc\") pod \"b3ad98d7-c092-4ef4-96b7-194255e37e83\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.020572 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsz66\" (UniqueName: \"kubernetes.io/projected/b3ad98d7-c092-4ef4-96b7-194255e37e83-kube-api-access-fsz66\") pod \"b3ad98d7-c092-4ef4-96b7-194255e37e83\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.020783 4771 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-swift-storage-0\") pod \"b3ad98d7-c092-4ef4-96b7-194255e37e83\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.020817 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-config\") pod \"b3ad98d7-c092-4ef4-96b7-194255e37e83\" (UID: \"b3ad98d7-c092-4ef4-96b7-194255e37e83\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.033261 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ad98d7-c092-4ef4-96b7-194255e37e83-kube-api-access-fsz66" (OuterVolumeSpecName: "kube-api-access-fsz66") pod "b3ad98d7-c092-4ef4-96b7-194255e37e83" (UID: "b3ad98d7-c092-4ef4-96b7-194255e37e83"). InnerVolumeSpecName "kube-api-access-fsz66". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.078562 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-config" (OuterVolumeSpecName: "config") pod "b3ad98d7-c092-4ef4-96b7-194255e37e83" (UID: "b3ad98d7-c092-4ef4-96b7-194255e37e83"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.079034 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3ad98d7-c092-4ef4-96b7-194255e37e83" (UID: "b3ad98d7-c092-4ef4-96b7-194255e37e83"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.084079 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3ad98d7-c092-4ef4-96b7-194255e37e83" (UID: "b3ad98d7-c092-4ef4-96b7-194255e37e83"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.087486 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b3ad98d7-c092-4ef4-96b7-194255e37e83" (UID: "b3ad98d7-c092-4ef4-96b7-194255e37e83"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.090242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3ad98d7-c092-4ef4-96b7-194255e37e83" (UID: "b3ad98d7-c092-4ef4-96b7-194255e37e83"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.123666 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.123711 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.123741 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.123754 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.123767 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3ad98d7-c092-4ef4-96b7-194255e37e83-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.123778 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsz66\" (UniqueName: \"kubernetes.io/projected/b3ad98d7-c092-4ef4-96b7-194255e37e83-kube-api-access-fsz66\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.419469 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-6sk5b"] Oct 01 15:14:43 crc kubenswrapper[4771]: E1001 15:14:43.419984 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ad98d7-c092-4ef4-96b7-194255e37e83" 
containerName="dnsmasq-dns" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.420000 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ad98d7-c092-4ef4-96b7-194255e37e83" containerName="dnsmasq-dns" Oct 01 15:14:43 crc kubenswrapper[4771]: E1001 15:14:43.420023 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ad98d7-c092-4ef4-96b7-194255e37e83" containerName="init" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.420031 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ad98d7-c092-4ef4-96b7-194255e37e83" containerName="init" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.420241 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ad98d7-c092-4ef4-96b7-194255e37e83" containerName="dnsmasq-dns" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.420997 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6sk5b" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.429870 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6sk5b"] Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.504410 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-bszpn"] Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.505533 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bszpn" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.534751 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npl8p\" (UniqueName: \"kubernetes.io/projected/3afe16e8-e581-4752-9128-0516838132ae-kube-api-access-npl8p\") pod \"nova-api-db-create-6sk5b\" (UID: \"3afe16e8-e581-4752-9128-0516838132ae\") " pod="openstack/nova-api-db-create-6sk5b" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.534807 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgf6m\" (UniqueName: \"kubernetes.io/projected/8c4d2578-710e-45af-86c8-8c8677ecc0b6-kube-api-access-rgf6m\") pod \"nova-cell0-db-create-bszpn\" (UID: \"8c4d2578-710e-45af-86c8-8c8677ecc0b6\") " pod="openstack/nova-cell0-db-create-bszpn" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.538319 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bszpn"] Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.607817 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bshc5"] Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.609155 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bshc5" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.615060 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bshc5"] Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.635884 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npl8p\" (UniqueName: \"kubernetes.io/projected/3afe16e8-e581-4752-9128-0516838132ae-kube-api-access-npl8p\") pod \"nova-api-db-create-6sk5b\" (UID: \"3afe16e8-e581-4752-9128-0516838132ae\") " pod="openstack/nova-api-db-create-6sk5b" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.635941 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgf6m\" (UniqueName: \"kubernetes.io/projected/8c4d2578-710e-45af-86c8-8c8677ecc0b6-kube-api-access-rgf6m\") pod \"nova-cell0-db-create-bszpn\" (UID: \"8c4d2578-710e-45af-86c8-8c8677ecc0b6\") " pod="openstack/nova-cell0-db-create-bszpn" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.635966 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfwr2\" (UniqueName: \"kubernetes.io/projected/f6003332-ade7-416e-8165-0b3768b94dc0-kube-api-access-pfwr2\") pod \"nova-cell1-db-create-bshc5\" (UID: \"f6003332-ade7-416e-8165-0b3768b94dc0\") " pod="openstack/nova-cell1-db-create-bshc5" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.654167 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npl8p\" (UniqueName: \"kubernetes.io/projected/3afe16e8-e581-4752-9128-0516838132ae-kube-api-access-npl8p\") pod \"nova-api-db-create-6sk5b\" (UID: \"3afe16e8-e581-4752-9128-0516838132ae\") " pod="openstack/nova-api-db-create-6sk5b" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.659543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgf6m\" 
(UniqueName: \"kubernetes.io/projected/8c4d2578-710e-45af-86c8-8c8677ecc0b6-kube-api-access-rgf6m\") pod \"nova-cell0-db-create-bszpn\" (UID: \"8c4d2578-710e-45af-86c8-8c8677ecc0b6\") " pod="openstack/nova-cell0-db-create-bszpn" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.723976 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bszpn" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.737024 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfwr2\" (UniqueName: \"kubernetes.io/projected/f6003332-ade7-416e-8165-0b3768b94dc0-kube-api-access-pfwr2\") pod \"nova-cell1-db-create-bshc5\" (UID: \"f6003332-ade7-416e-8165-0b3768b94dc0\") " pod="openstack/nova-cell1-db-create-bshc5" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.740141 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6sk5b" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.759382 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfwr2\" (UniqueName: \"kubernetes.io/projected/f6003332-ade7-416e-8165-0b3768b94dc0-kube-api-access-pfwr2\") pod \"nova-cell1-db-create-bshc5\" (UID: \"f6003332-ade7-416e-8165-0b3768b94dc0\") " pod="openstack/nova-cell1-db-create-bshc5" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.917056 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.941778 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data-custom\") pod \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.941913 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wl92\" (UniqueName: \"kubernetes.io/projected/7323fb43-89a2-4be3-96fc-f8633f91fd8c-kube-api-access-8wl92\") pod \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.941974 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323fb43-89a2-4be3-96fc-f8633f91fd8c-logs\") pod \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.942003 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data\") pod \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.942066 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-combined-ca-bundle\") pod \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\" (UID: \"7323fb43-89a2-4be3-96fc-f8633f91fd8c\") " Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.945167 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7323fb43-89a2-4be3-96fc-f8633f91fd8c-logs" (OuterVolumeSpecName: "logs") pod "7323fb43-89a2-4be3-96fc-f8633f91fd8c" (UID: "7323fb43-89a2-4be3-96fc-f8633f91fd8c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.949947 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7323fb43-89a2-4be3-96fc-f8633f91fd8c-kube-api-access-8wl92" (OuterVolumeSpecName: "kube-api-access-8wl92") pod "7323fb43-89a2-4be3-96fc-f8633f91fd8c" (UID: "7323fb43-89a2-4be3-96fc-f8633f91fd8c"). InnerVolumeSpecName "kube-api-access-8wl92". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.953087 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7323fb43-89a2-4be3-96fc-f8633f91fd8c" (UID: "7323fb43-89a2-4be3-96fc-f8633f91fd8c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.975495 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65965d6475-kvgqz" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.983301 4771 generic.go:334] "Generic (PLEG): container finished" podID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerID="68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f" exitCode=0 Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.983334 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554565568b-5js82" event={"ID":"7323fb43-89a2-4be3-96fc-f8633f91fd8c","Type":"ContainerDied","Data":"68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f"} Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.983358 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-554565568b-5js82" event={"ID":"7323fb43-89a2-4be3-96fc-f8633f91fd8c","Type":"ContainerDied","Data":"942074b4b9bcfcd068c66ff1047b84b470342195265820a4840a741705fb1a27"} Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.983376 4771 scope.go:117] "RemoveContainer" containerID="68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.983471 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-554565568b-5js82" Oct 01 15:14:43 crc kubenswrapper[4771]: I1001 15:14:43.992242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7323fb43-89a2-4be3-96fc-f8633f91fd8c" (UID: "7323fb43-89a2-4be3-96fc-f8633f91fd8c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.031999 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-kvgqz"] Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.033832 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data" (OuterVolumeSpecName: "config-data") pod "7323fb43-89a2-4be3-96fc-f8633f91fd8c" (UID: "7323fb43-89a2-4be3-96fc-f8633f91fd8c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.042951 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65965d6475-kvgqz"] Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.043963 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.044002 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.044011 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wl92\" (UniqueName: \"kubernetes.io/projected/7323fb43-89a2-4be3-96fc-f8633f91fd8c-kube-api-access-8wl92\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.044022 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7323fb43-89a2-4be3-96fc-f8633f91fd8c-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.044031 4771 reconciler_common.go:293] "Volume detached for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7323fb43-89a2-4be3-96fc-f8633f91fd8c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.059122 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bshc5" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.072805 4771 scope.go:117] "RemoveContainer" containerID="54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.119609 4771 scope.go:117] "RemoveContainer" containerID="68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f" Oct 01 15:14:44 crc kubenswrapper[4771]: E1001 15:14:44.121348 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f\": container with ID starting with 68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f not found: ID does not exist" containerID="68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.121386 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f"} err="failed to get container status \"68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f\": rpc error: code = NotFound desc = could not find container \"68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f\": container with ID starting with 68d43ea486383f4d067516899ec14d857f94fb7ddb5f08f81c2f81aa6934d79f not found: ID does not exist" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.121412 4771 scope.go:117] "RemoveContainer" containerID="54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68" Oct 01 15:14:44 crc kubenswrapper[4771]: E1001 15:14:44.121701 4771 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68\": container with ID starting with 54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68 not found: ID does not exist" containerID="54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.121721 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68"} err="failed to get container status \"54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68\": rpc error: code = NotFound desc = could not find container \"54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68\": container with ID starting with 54b337cec3f6977f9a09c08b16d495d2123ae4bad884381dc31029bbf7614c68 not found: ID does not exist" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.311542 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-bszpn"] Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.330650 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-554565568b-5js82"] Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.345333 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-554565568b-5js82"] Oct 01 15:14:44 crc kubenswrapper[4771]: W1001 15:14:44.366550 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c4d2578_710e_45af_86c8_8c8677ecc0b6.slice/crio-344a3b1e006a99e2ec27e49d61e9a928490724ebb3a40934299871f46cc21c95 WatchSource:0}: Error finding container 344a3b1e006a99e2ec27e49d61e9a928490724ebb3a40934299871f46cc21c95: Status 404 returned error can't find the container with id 
344a3b1e006a99e2ec27e49d61e9a928490724ebb3a40934299871f46cc21c95 Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.407718 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-6sk5b"] Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.616516 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.657775 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x298k\" (UniqueName: \"kubernetes.io/projected/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-kube-api-access-x298k\") pod \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.657830 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-combined-ca-bundle\") pod \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.657851 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-etc-machine-id\") pod \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.657929 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data-custom\") pod \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.657970 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data\") pod \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.658001 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-scripts\") pod \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\" (UID: \"4f26bf4c-7205-40ae-b698-e4d8a796f8f6\") " Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.658228 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bshc5"] Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.658320 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4f26bf4c-7205-40ae-b698-e4d8a796f8f6" (UID: "4f26bf4c-7205-40ae-b698-e4d8a796f8f6"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.675820 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4f26bf4c-7205-40ae-b698-e4d8a796f8f6" (UID: "4f26bf4c-7205-40ae-b698-e4d8a796f8f6"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.693949 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-scripts" (OuterVolumeSpecName: "scripts") pod "4f26bf4c-7205-40ae-b698-e4d8a796f8f6" (UID: "4f26bf4c-7205-40ae-b698-e4d8a796f8f6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.693970 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-kube-api-access-x298k" (OuterVolumeSpecName: "kube-api-access-x298k") pod "4f26bf4c-7205-40ae-b698-e4d8a796f8f6" (UID: "4f26bf4c-7205-40ae-b698-e4d8a796f8f6"). InnerVolumeSpecName "kube-api-access-x298k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:44 crc kubenswrapper[4771]: W1001 15:14:44.710311 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf6003332_ade7_416e_8165_0b3768b94dc0.slice/crio-d7e7a0b560eb73c8aa369ceb4428adcfb0783e55c027e6a8cc0165b1c34206cb WatchSource:0}: Error finding container d7e7a0b560eb73c8aa369ceb4428adcfb0783e55c027e6a8cc0165b1c34206cb: Status 404 returned error can't find the container with id d7e7a0b560eb73c8aa369ceb4428adcfb0783e55c027e6a8cc0165b1c34206cb Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.760690 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x298k\" (UniqueName: \"kubernetes.io/projected/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-kube-api-access-x298k\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.760745 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.760755 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.760764 4771 reconciler_common.go:293] "Volume detached for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.881426 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f26bf4c-7205-40ae-b698-e4d8a796f8f6" (UID: "4f26bf4c-7205-40ae-b698-e4d8a796f8f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.918265 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-d6484bc47-hkzdw"] Oct 01 15:14:44 crc kubenswrapper[4771]: E1001 15:14:44.918671 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerName="probe" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.918685 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerName="probe" Oct 01 15:14:44 crc kubenswrapper[4771]: E1001 15:14:44.918703 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerName="cinder-scheduler" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.918709 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerName="cinder-scheduler" Oct 01 15:14:44 crc kubenswrapper[4771]: E1001 15:14:44.918756 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerName="barbican-api" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.918762 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerName="barbican-api" Oct 01 15:14:44 crc kubenswrapper[4771]: E1001 15:14:44.918771 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerName="barbican-api-log" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.918777 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerName="barbican-api-log" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.918948 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerName="cinder-scheduler" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.918970 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerName="barbican-api-log" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.918980 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" containerName="barbican-api" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.919000 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerName="probe" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.919940 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.922270 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.936256 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.936452 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.957474 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data" (OuterVolumeSpecName: "config-data") pod "4f26bf4c-7205-40ae-b698-e4d8a796f8f6" (UID: "4f26bf4c-7205-40ae-b698-e4d8a796f8f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.963327 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.963360 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f26bf4c-7205-40ae-b698-e4d8a796f8f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:44 crc kubenswrapper[4771]: I1001 15:14:44.970890 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d6484bc47-hkzdw"] Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.009088 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bshc5" 
event={"ID":"f6003332-ade7-416e-8165-0b3768b94dc0","Type":"ContainerStarted","Data":"d7e7a0b560eb73c8aa369ceb4428adcfb0783e55c027e6a8cc0165b1c34206cb"} Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.010349 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bszpn" event={"ID":"8c4d2578-710e-45af-86c8-8c8677ecc0b6","Type":"ContainerStarted","Data":"4fc72c6cf9e93feafe7bc58a3482e603c61ba143774546119094c46592de66bf"} Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.010373 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bszpn" event={"ID":"8c4d2578-710e-45af-86c8-8c8677ecc0b6","Type":"ContainerStarted","Data":"344a3b1e006a99e2ec27e49d61e9a928490724ebb3a40934299871f46cc21c95"} Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.023994 4771 generic.go:334] "Generic (PLEG): container finished" podID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" containerID="d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5" exitCode=0 Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.024087 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f26bf4c-7205-40ae-b698-e4d8a796f8f6","Type":"ContainerDied","Data":"d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5"} Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.024118 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4f26bf4c-7205-40ae-b698-e4d8a796f8f6","Type":"ContainerDied","Data":"7d502be1a04c13a4e07e8301f4135216653172113967137e3723df28543f0ad6"} Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.024139 4771 scope.go:117] "RemoveContainer" containerID="902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.024293 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.033309 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-bszpn" podStartSLOduration=2.033291765 podStartE2EDuration="2.033291765s" podCreationTimestamp="2025-10-01 15:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:45.033030519 +0000 UTC m=+1129.652205700" watchObservedRunningTime="2025-10-01 15:14:45.033291765 +0000 UTC m=+1129.652466936" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.036951 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6sk5b" event={"ID":"3afe16e8-e581-4752-9128-0516838132ae","Type":"ContainerStarted","Data":"d038276b3c071804af5139b55833ae2ef7f92b284760ce854c9d49c53f53ea19"} Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.036986 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6sk5b" event={"ID":"3afe16e8-e581-4752-9128-0516838132ae","Type":"ContainerStarted","Data":"aef91e0e8aca5d0d8000e9e0ad7cbee2dbb7c1a5f92bbf26be2949a89c3988d3"} Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.062065 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-6sk5b" podStartSLOduration=2.062050662 podStartE2EDuration="2.062050662s" podCreationTimestamp="2025-10-01 15:14:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:45.061399776 +0000 UTC m=+1129.680574947" watchObservedRunningTime="2025-10-01 15:14:45.062050662 +0000 UTC m=+1129.681225823" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.064880 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-internal-tls-certs\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.064953 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-combined-ca-bundle\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.064989 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-config-data\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.065062 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-public-tls-certs\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.065114 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4b812be-6e39-4ac8-b43f-dba345603f74-run-httpd\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.065168 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6g7n\" (UniqueName: \"kubernetes.io/projected/e4b812be-6e39-4ac8-b43f-dba345603f74-kube-api-access-q6g7n\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.065195 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4b812be-6e39-4ac8-b43f-dba345603f74-etc-swift\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.065226 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4b812be-6e39-4ac8-b43f-dba345603f74-log-httpd\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.078856 4771 scope.go:117] "RemoveContainer" containerID="d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.092242 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.102003 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.123935 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.127925 4771 scope.go:117] "RemoveContainer" containerID="902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.128107 
4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.130124 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 01 15:14:45 crc kubenswrapper[4771]: E1001 15:14:45.130138 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681\": container with ID starting with 902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681 not found: ID does not exist" containerID="902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.130300 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681"} err="failed to get container status \"902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681\": rpc error: code = NotFound desc = could not find container \"902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681\": container with ID starting with 902492f5d1db20b120e1e3e344698dee4d3fb8ffd1ec2d0cd21ac3b22b036681 not found: ID does not exist" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.130405 4771 scope.go:117] "RemoveContainer" containerID="d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5" Oct 01 15:14:45 crc kubenswrapper[4771]: E1001 15:14:45.131539 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5\": container with ID starting with d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5 not found: ID does not exist" containerID="d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5" Oct 01 
15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.131596 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5"} err="failed to get container status \"d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5\": rpc error: code = NotFound desc = could not find container \"d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5\": container with ID starting with d48291e5e50b8a4cc2f0a97e00cba34d3e77cb75dfa7b65f2a93ab54743e77e5 not found: ID does not exist" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.140403 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.166797 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-public-tls-certs\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.166860 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4b812be-6e39-4ac8-b43f-dba345603f74-run-httpd\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.166915 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6g7n\" (UniqueName: \"kubernetes.io/projected/e4b812be-6e39-4ac8-b43f-dba345603f74-kube-api-access-q6g7n\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.166935 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4b812be-6e39-4ac8-b43f-dba345603f74-etc-swift\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.166966 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4b812be-6e39-4ac8-b43f-dba345603f74-log-httpd\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.167003 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-internal-tls-certs\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.167051 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-combined-ca-bundle\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.167079 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-config-data\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.170103 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e4b812be-6e39-4ac8-b43f-dba345603f74-run-httpd\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.179077 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4b812be-6e39-4ac8-b43f-dba345603f74-log-httpd\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.179806 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-public-tls-certs\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.181017 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-config-data\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.188977 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-combined-ca-bundle\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.189268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6g7n\" (UniqueName: \"kubernetes.io/projected/e4b812be-6e39-4ac8-b43f-dba345603f74-kube-api-access-q6g7n\") pod 
\"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.191017 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4b812be-6e39-4ac8-b43f-dba345603f74-internal-tls-certs\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.191575 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4b812be-6e39-4ac8-b43f-dba345603f74-etc-swift\") pod \"swift-proxy-d6484bc47-hkzdw\" (UID: \"e4b812be-6e39-4ac8-b43f-dba345603f74\") " pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.268621 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tqfl\" (UniqueName: \"kubernetes.io/projected/3a36d28b-706e-4639-9d68-158427aaa655-kube-api-access-7tqfl\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.269293 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.269346 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: 
\"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.269956 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-config-data\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.270034 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-scripts\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.270113 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a36d28b-706e-4639-9d68-158427aaa655-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.278969 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.375910 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tqfl\" (UniqueName: \"kubernetes.io/projected/3a36d28b-706e-4639-9d68-158427aaa655-kube-api-access-7tqfl\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.375987 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.376024 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.376191 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-config-data\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.376257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-scripts\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.376300 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a36d28b-706e-4639-9d68-158427aaa655-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.376431 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3a36d28b-706e-4639-9d68-158427aaa655-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.385117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-config-data\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.387252 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-scripts\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.389835 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.392549 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a36d28b-706e-4639-9d68-158427aaa655-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.400116 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tqfl\" (UniqueName: \"kubernetes.io/projected/3a36d28b-706e-4639-9d68-158427aaa655-kube-api-access-7tqfl\") pod \"cinder-scheduler-0\" (UID: \"3a36d28b-706e-4639-9d68-158427aaa655\") " pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.449912 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 01 15:14:45 crc kubenswrapper[4771]: I1001 15:14:45.853491 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-d6484bc47-hkzdw"] Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.008534 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f26bf4c-7205-40ae-b698-e4d8a796f8f6" path="/var/lib/kubelet/pods/4f26bf4c-7205-40ae-b698-e4d8a796f8f6/volumes" Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.010260 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7323fb43-89a2-4be3-96fc-f8633f91fd8c" path="/var/lib/kubelet/pods/7323fb43-89a2-4be3-96fc-f8633f91fd8c/volumes" Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.011262 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3ad98d7-c092-4ef4-96b7-194255e37e83" path="/var/lib/kubelet/pods/b3ad98d7-c092-4ef4-96b7-194255e37e83/volumes" Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.013319 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.056387 4771 generic.go:334] "Generic (PLEG): container finished" podID="3afe16e8-e581-4752-9128-0516838132ae" containerID="d038276b3c071804af5139b55833ae2ef7f92b284760ce854c9d49c53f53ea19" exitCode=0 Oct 01 15:14:46 crc 
kubenswrapper[4771]: I1001 15:14:46.056442 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6sk5b" event={"ID":"3afe16e8-e581-4752-9128-0516838132ae","Type":"ContainerDied","Data":"d038276b3c071804af5139b55833ae2ef7f92b284760ce854c9d49c53f53ea19"} Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.059809 4771 generic.go:334] "Generic (PLEG): container finished" podID="f6003332-ade7-416e-8165-0b3768b94dc0" containerID="9f82b8ba173061e613facbd6e0767a3bb0a6f57a8ebd68abe3447c956660cf05" exitCode=0 Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.059981 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bshc5" event={"ID":"f6003332-ade7-416e-8165-0b3768b94dc0","Type":"ContainerDied","Data":"9f82b8ba173061e613facbd6e0767a3bb0a6f57a8ebd68abe3447c956660cf05"} Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.067019 4771 generic.go:334] "Generic (PLEG): container finished" podID="8c4d2578-710e-45af-86c8-8c8677ecc0b6" containerID="4fc72c6cf9e93feafe7bc58a3482e603c61ba143774546119094c46592de66bf" exitCode=0 Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.067132 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bszpn" event={"ID":"8c4d2578-710e-45af-86c8-8c8677ecc0b6","Type":"ContainerDied","Data":"4fc72c6cf9e93feafe7bc58a3482e603c61ba143774546119094c46592de66bf"} Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.537538 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.537885 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="proxy-httpd" containerID="cri-o://711c0d9cbcec5dcb83a304ad264baccaf01817bedf2c05babbc4989523752a60" gracePeriod=30 Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.537927 4771 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="sg-core" containerID="cri-o://0e3fec9e2d3321977f601130aa2e001e5c14828d6eaf2a799904298413c30b9d" gracePeriod=30 Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.537988 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="ceilometer-notification-agent" containerID="cri-o://9d16671ac2a88a6acf48095805e3d17e3f505860fd999bf1e626b6300d8239ba" gracePeriod=30 Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.537835 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="ceilometer-central-agent" containerID="cri-o://de20565d8b41e17201fed26d95a6eeea0e35a0cbbb74b4a088d33af68298dfbf" gracePeriod=30 Oct 01 15:14:46 crc kubenswrapper[4771]: I1001 15:14:46.643351 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.154:3000/\": read tcp 10.217.0.2:60922->10.217.0.154:3000: read: connection reset by peer" Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.091865 4771 generic.go:334] "Generic (PLEG): container finished" podID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerID="711c0d9cbcec5dcb83a304ad264baccaf01817bedf2c05babbc4989523752a60" exitCode=0 Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.092164 4771 generic.go:334] "Generic (PLEG): container finished" podID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerID="0e3fec9e2d3321977f601130aa2e001e5c14828d6eaf2a799904298413c30b9d" exitCode=2 Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.092174 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerID="de20565d8b41e17201fed26d95a6eeea0e35a0cbbb74b4a088d33af68298dfbf" exitCode=0 Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.091947 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerDied","Data":"711c0d9cbcec5dcb83a304ad264baccaf01817bedf2c05babbc4989523752a60"} Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.092235 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerDied","Data":"0e3fec9e2d3321977f601130aa2e001e5c14828d6eaf2a799904298413c30b9d"} Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.092246 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerDied","Data":"de20565d8b41e17201fed26d95a6eeea0e35a0cbbb74b4a088d33af68298dfbf"} Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.095417 4771 generic.go:334] "Generic (PLEG): container finished" podID="f66ea972-6475-4624-a093-8884ead588f8" containerID="23622ff95d6dcef208b1ee5e0aa03a223373fc0f91fb4beeaf4078217eaab3c5" exitCode=0 Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.095514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4877d5c6-thmgz" event={"ID":"f66ea972-6475-4624-a093-8884ead588f8","Type":"ContainerDied","Data":"23622ff95d6dcef208b1ee5e0aa03a223373fc0f91fb4beeaf4078217eaab3c5"} Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.890116 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.890164 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 15:14:47 crc kubenswrapper[4771]: 
I1001 15:14:47.921090 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 15:14:47 crc kubenswrapper[4771]: I1001 15:14:47.943203 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 15:14:48 crc kubenswrapper[4771]: I1001 15:14:48.120710 4771 generic.go:334] "Generic (PLEG): container finished" podID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerID="9d16671ac2a88a6acf48095805e3d17e3f505860fd999bf1e626b6300d8239ba" exitCode=0 Oct 01 15:14:48 crc kubenswrapper[4771]: I1001 15:14:48.120904 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerDied","Data":"9d16671ac2a88a6acf48095805e3d17e3f505860fd999bf1e626b6300d8239ba"} Oct 01 15:14:48 crc kubenswrapper[4771]: I1001 15:14:48.121380 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 15:14:48 crc kubenswrapper[4771]: I1001 15:14:48.121406 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 15:14:48 crc kubenswrapper[4771]: I1001 15:14:48.718556 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:48 crc kubenswrapper[4771]: I1001 15:14:48.718842 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:48 crc kubenswrapper[4771]: I1001 15:14:48.776072 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:48 crc kubenswrapper[4771]: I1001 15:14:48.781554 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:49 crc 
kubenswrapper[4771]: I1001 15:14:49.128338 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:49 crc kubenswrapper[4771]: I1001 15:14:49.128366 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:50 crc kubenswrapper[4771]: I1001 15:14:50.093184 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 15:14:50 crc kubenswrapper[4771]: I1001 15:14:50.095254 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 15:14:51 crc kubenswrapper[4771]: I1001 15:14:51.127787 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:51 crc kubenswrapper[4771]: I1001 15:14:51.130492 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 15:14:51 crc kubenswrapper[4771]: W1001 15:14:51.867527 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a36d28b_706e_4639_9d68_158427aaa655.slice/crio-4fc9f7d7f64d395e28f825dbd10083567a549b30f47b58d66913eee98bd2b288 WatchSource:0}: Error finding container 4fc9f7d7f64d395e28f825dbd10083567a549b30f47b58d66913eee98bd2b288: Status 404 returned error can't find the container with id 4fc9f7d7f64d395e28f825dbd10083567a549b30f47b58d66913eee98bd2b288 Oct 01 15:14:51 crc kubenswrapper[4771]: W1001 15:14:51.869566 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4b812be_6e39_4ac8_b43f_dba345603f74.slice/crio-f01cda89c84643ba04a737a5444a8d040e34ed167f0ddeee2a1c0d9e1372557f WatchSource:0}: Error finding container 
f01cda89c84643ba04a737a5444a8d040e34ed167f0ddeee2a1c0d9e1372557f: Status 404 returned error can't find the container with id f01cda89c84643ba04a737a5444a8d040e34ed167f0ddeee2a1c0d9e1372557f Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.151303 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-bszpn" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.152637 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-6sk5b" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.160277 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bshc5" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.174096 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-6sk5b" event={"ID":"3afe16e8-e581-4752-9128-0516838132ae","Type":"ContainerDied","Data":"aef91e0e8aca5d0d8000e9e0ad7cbee2dbb7c1a5f92bbf26be2949a89c3988d3"} Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.174136 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aef91e0e8aca5d0d8000e9e0ad7cbee2dbb7c1a5f92bbf26be2949a89c3988d3" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.174191 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-6sk5b" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.175948 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3a36d28b-706e-4639-9d68-158427aaa655","Type":"ContainerStarted","Data":"4fc9f7d7f64d395e28f825dbd10083567a549b30f47b58d66913eee98bd2b288"} Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.190060 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bshc5" event={"ID":"f6003332-ade7-416e-8165-0b3768b94dc0","Type":"ContainerDied","Data":"d7e7a0b560eb73c8aa369ceb4428adcfb0783e55c027e6a8cc0165b1c34206cb"} Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.190114 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7e7a0b560eb73c8aa369ceb4428adcfb0783e55c027e6a8cc0165b1c34206cb" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.190180 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bshc5" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.199502 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-bszpn" event={"ID":"8c4d2578-710e-45af-86c8-8c8677ecc0b6","Type":"ContainerDied","Data":"344a3b1e006a99e2ec27e49d61e9a928490724ebb3a40934299871f46cc21c95"} Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.199542 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="344a3b1e006a99e2ec27e49d61e9a928490724ebb3a40934299871f46cc21c95" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.199599 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-bszpn" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.209850 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgf6m\" (UniqueName: \"kubernetes.io/projected/8c4d2578-710e-45af-86c8-8c8677ecc0b6-kube-api-access-rgf6m\") pod \"8c4d2578-710e-45af-86c8-8c8677ecc0b6\" (UID: \"8c4d2578-710e-45af-86c8-8c8677ecc0b6\") " Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.209935 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pfwr2\" (UniqueName: \"kubernetes.io/projected/f6003332-ade7-416e-8165-0b3768b94dc0-kube-api-access-pfwr2\") pod \"f6003332-ade7-416e-8165-0b3768b94dc0\" (UID: \"f6003332-ade7-416e-8165-0b3768b94dc0\") " Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.209977 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npl8p\" (UniqueName: \"kubernetes.io/projected/3afe16e8-e581-4752-9128-0516838132ae-kube-api-access-npl8p\") pod \"3afe16e8-e581-4752-9128-0516838132ae\" (UID: \"3afe16e8-e581-4752-9128-0516838132ae\") " Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.211881 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d6484bc47-hkzdw" event={"ID":"e4b812be-6e39-4ac8-b43f-dba345603f74","Type":"ContainerStarted","Data":"f01cda89c84643ba04a737a5444a8d040e34ed167f0ddeee2a1c0d9e1372557f"} Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.214764 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afe16e8-e581-4752-9128-0516838132ae-kube-api-access-npl8p" (OuterVolumeSpecName: "kube-api-access-npl8p") pod "3afe16e8-e581-4752-9128-0516838132ae" (UID: "3afe16e8-e581-4752-9128-0516838132ae"). InnerVolumeSpecName "kube-api-access-npl8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.218953 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c4d2578-710e-45af-86c8-8c8677ecc0b6-kube-api-access-rgf6m" (OuterVolumeSpecName: "kube-api-access-rgf6m") pod "8c4d2578-710e-45af-86c8-8c8677ecc0b6" (UID: "8c4d2578-710e-45af-86c8-8c8677ecc0b6"). InnerVolumeSpecName "kube-api-access-rgf6m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.229991 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6003332-ade7-416e-8165-0b3768b94dc0-kube-api-access-pfwr2" (OuterVolumeSpecName: "kube-api-access-pfwr2") pod "f6003332-ade7-416e-8165-0b3768b94dc0" (UID: "f6003332-ade7-416e-8165-0b3768b94dc0"). InnerVolumeSpecName "kube-api-access-pfwr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.313297 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgf6m\" (UniqueName: \"kubernetes.io/projected/8c4d2578-710e-45af-86c8-8c8677ecc0b6-kube-api-access-rgf6m\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.313352 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pfwr2\" (UniqueName: \"kubernetes.io/projected/f6003332-ade7-416e-8165-0b3768b94dc0-kube-api-access-pfwr2\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.313366 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npl8p\" (UniqueName: \"kubernetes.io/projected/3afe16e8-e581-4752-9128-0516838132ae-kube-api-access-npl8p\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:52 crc kubenswrapper[4771]: I1001 15:14:52.976593 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.027927 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-log-httpd\") pod \"501a1350-8a05-4e56-8e04-57cb1f4c721b\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.028035 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2hjwd\" (UniqueName: \"kubernetes.io/projected/501a1350-8a05-4e56-8e04-57cb1f4c721b-kube-api-access-2hjwd\") pod \"501a1350-8a05-4e56-8e04-57cb1f4c721b\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.028111 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-scripts\") pod \"501a1350-8a05-4e56-8e04-57cb1f4c721b\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.028152 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-run-httpd\") pod \"501a1350-8a05-4e56-8e04-57cb1f4c721b\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.028232 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-config-data\") pod \"501a1350-8a05-4e56-8e04-57cb1f4c721b\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.028255 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-combined-ca-bundle\") pod \"501a1350-8a05-4e56-8e04-57cb1f4c721b\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.028275 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-sg-core-conf-yaml\") pod \"501a1350-8a05-4e56-8e04-57cb1f4c721b\" (UID: \"501a1350-8a05-4e56-8e04-57cb1f4c721b\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.028558 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "501a1350-8a05-4e56-8e04-57cb1f4c721b" (UID: "501a1350-8a05-4e56-8e04-57cb1f4c721b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.028786 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.029308 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "501a1350-8a05-4e56-8e04-57cb1f4c721b" (UID: "501a1350-8a05-4e56-8e04-57cb1f4c721b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.033145 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-scripts" (OuterVolumeSpecName: "scripts") pod "501a1350-8a05-4e56-8e04-57cb1f4c721b" (UID: "501a1350-8a05-4e56-8e04-57cb1f4c721b"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.034977 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/501a1350-8a05-4e56-8e04-57cb1f4c721b-kube-api-access-2hjwd" (OuterVolumeSpecName: "kube-api-access-2hjwd") pod "501a1350-8a05-4e56-8e04-57cb1f4c721b" (UID: "501a1350-8a05-4e56-8e04-57cb1f4c721b"). InnerVolumeSpecName "kube-api-access-2hjwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.066970 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "501a1350-8a05-4e56-8e04-57cb1f4c721b" (UID: "501a1350-8a05-4e56-8e04-57cb1f4c721b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.116746 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "501a1350-8a05-4e56-8e04-57cb1f4c721b" (UID: "501a1350-8a05-4e56-8e04-57cb1f4c721b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.130468 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.130502 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/501a1350-8a05-4e56-8e04-57cb1f4c721b-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.130514 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.130523 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.130532 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2hjwd\" (UniqueName: \"kubernetes.io/projected/501a1350-8a05-4e56-8e04-57cb1f4c721b-kube-api-access-2hjwd\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.151843 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-config-data" (OuterVolumeSpecName: "config-data") pod "501a1350-8a05-4e56-8e04-57cb1f4c721b" (UID: "501a1350-8a05-4e56-8e04-57cb1f4c721b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.157440 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.167264 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.200397 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-67bb557f68-mz5cv" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.232528 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-ovndb-tls-certs\") pod \"f66ea972-6475-4624-a093-8884ead588f8\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.232578 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-combined-ca-bundle\") pod \"f66ea972-6475-4624-a093-8884ead588f8\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.232601 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-config\") pod \"f66ea972-6475-4624-a093-8884ead588f8\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.232617 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-httpd-config\") pod \"f66ea972-6475-4624-a093-8884ead588f8\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " Oct 01 
15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.232691 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlcw5\" (UniqueName: \"kubernetes.io/projected/f66ea972-6475-4624-a093-8884ead588f8-kube-api-access-xlcw5\") pod \"f66ea972-6475-4624-a093-8884ead588f8\" (UID: \"f66ea972-6475-4624-a093-8884ead588f8\") " Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.233149 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/501a1350-8a05-4e56-8e04-57cb1f4c721b-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.242881 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f66ea972-6475-4624-a093-8884ead588f8" (UID: "f66ea972-6475-4624-a093-8884ead588f8"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.255670 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3a36d28b-706e-4639-9d68-158427aaa655","Type":"ContainerStarted","Data":"78b582ae9a2a4aa57bf306985a59c55060692654efb43e08d57f8eb236d0b625"} Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.262054 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f66ea972-6475-4624-a093-8884ead588f8-kube-api-access-xlcw5" (OuterVolumeSpecName: "kube-api-access-xlcw5") pod "f66ea972-6475-4624-a093-8884ead588f8" (UID: "f66ea972-6475-4624-a093-8884ead588f8"). InnerVolumeSpecName "kube-api-access-xlcw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.262475 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6c4877d5c6-thmgz" event={"ID":"f66ea972-6475-4624-a093-8884ead588f8","Type":"ContainerDied","Data":"c12c40f91ddd483f75bcfb4e3d79ca5d1419d45516bc1bb57e17bc3bac935235"} Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.262557 4771 scope.go:117] "RemoveContainer" containerID="1295acd6636ce8c5c95b86fa2c5b81795c12be6d40eb7105163c265e27ddebc4" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.262909 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6c4877d5c6-thmgz" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.287514 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"501a1350-8a05-4e56-8e04-57cb1f4c721b","Type":"ContainerDied","Data":"f63089bc5904ee661f4e5ef92227246cbd0972f54e87c821455a99dc13010482"} Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.287639 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.317436 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d6484bc47-hkzdw" event={"ID":"e4b812be-6e39-4ac8-b43f-dba345603f74","Type":"ContainerStarted","Data":"cd5c0b71abc2ed0497aa44478f7129294909ffb1b7c7af0c03b786dbdd000ade"} Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.317955 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.317989 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.318000 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-d6484bc47-hkzdw" event={"ID":"e4b812be-6e39-4ac8-b43f-dba345603f74","Type":"ContainerStarted","Data":"15232076d5391dc8125be16969b4d1ff2ad18421694fe088c38cdad735dfa0a9"} Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.337405 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.337629 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlcw5\" (UniqueName: \"kubernetes.io/projected/f66ea972-6475-4624-a093-8884ead588f8-kube-api-access-xlcw5\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.384022 4771 scope.go:117] "RemoveContainer" containerID="23622ff95d6dcef208b1ee5e0aa03a223373fc0f91fb4beeaf4078217eaab3c5" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.400759 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-combined-ca-bundle" (OuterVolumeSpecName: 
"combined-ca-bundle") pod "f66ea972-6475-4624-a093-8884ead588f8" (UID: "f66ea972-6475-4624-a093-8884ead588f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.422039 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "f66ea972-6475-4624-a093-8884ead588f8" (UID: "f66ea972-6475-4624-a093-8884ead588f8"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.424913 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-config" (OuterVolumeSpecName: "config") pod "f66ea972-6475-4624-a093-8884ead588f8" (UID: "f66ea972-6475-4624-a093-8884ead588f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.428420 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-d6484bc47-hkzdw" podStartSLOduration=9.428398575 podStartE2EDuration="9.428398575s" podCreationTimestamp="2025-10-01 15:14:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:53.34085706 +0000 UTC m=+1137.960032251" watchObservedRunningTime="2025-10-01 15:14:53.428398575 +0000 UTC m=+1138.047573746" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.440035 4771 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.440064 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.440074 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f66ea972-6475-4624-a093-8884ead588f8-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.453096 4771 scope.go:117] "RemoveContainer" containerID="711c0d9cbcec5dcb83a304ad264baccaf01817bedf2c05babbc4989523752a60" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.458529 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.479935 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.489823 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ceilometer-0"] Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490243 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="ceilometer-central-agent" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490263 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="ceilometer-central-agent" Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490281 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ea972-6475-4624-a093-8884ead588f8" containerName="neutron-httpd" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490288 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ea972-6475-4624-a093-8884ead588f8" containerName="neutron-httpd" Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490300 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="ceilometer-notification-agent" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490306 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="ceilometer-notification-agent" Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490317 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6003332-ade7-416e-8165-0b3768b94dc0" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490323 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6003332-ade7-416e-8165-0b3768b94dc0" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490330 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c4d2578-710e-45af-86c8-8c8677ecc0b6" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490336 4771 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="8c4d2578-710e-45af-86c8-8c8677ecc0b6" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490348 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="sg-core" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490354 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="sg-core" Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490363 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afe16e8-e581-4752-9128-0516838132ae" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490369 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afe16e8-e581-4752-9128-0516838132ae" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490379 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="proxy-httpd" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490385 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="proxy-httpd" Oct 01 15:14:53 crc kubenswrapper[4771]: E1001 15:14:53.490404 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f66ea972-6475-4624-a093-8884ead588f8" containerName="neutron-api" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490409 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f66ea972-6475-4624-a093-8884ead588f8" containerName="neutron-api" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490579 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="ceilometer-central-agent" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490588 4771 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="sg-core" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490597 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6003332-ade7-416e-8165-0b3768b94dc0" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490609 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c4d2578-710e-45af-86c8-8c8677ecc0b6" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490618 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="proxy-httpd" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490634 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66ea972-6475-4624-a093-8884ead588f8" containerName="neutron-api" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490642 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" containerName="ceilometer-notification-agent" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490651 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afe16e8-e581-4752-9128-0516838132ae" containerName="mariadb-database-create" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.490657 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f66ea972-6475-4624-a093-8884ead588f8" containerName="neutron-httpd" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.492318 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.496000 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.496177 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.500476 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.503755 4771 scope.go:117] "RemoveContainer" containerID="0e3fec9e2d3321977f601130aa2e001e5c14828d6eaf2a799904298413c30b9d" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.541601 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.541662 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-scripts\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.541702 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fflfn\" (UniqueName: \"kubernetes.io/projected/14a91b80-f355-4418-869a-35b3080e7695-kube-api-access-fflfn\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.541724 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.541772 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-config-data\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.541788 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-log-httpd\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.541829 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-run-httpd\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.552594 4771 scope.go:117] "RemoveContainer" containerID="9d16671ac2a88a6acf48095805e3d17e3f505860fd999bf1e626b6300d8239ba" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.581300 4771 scope.go:117] "RemoveContainer" containerID="de20565d8b41e17201fed26d95a6eeea0e35a0cbbb74b4a088d33af68298dfbf" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.599409 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-1f68-account-create-d94sr"] Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.605583 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1f68-account-create-d94sr" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.608725 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.611747 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1f68-account-create-d94sr"] Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.628724 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6c4877d5c6-thmgz"] Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.635806 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6c4877d5c6-thmgz"] Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.643566 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.643621 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-scripts\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.643662 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fflfn\" (UniqueName: \"kubernetes.io/projected/14a91b80-f355-4418-869a-35b3080e7695-kube-api-access-fflfn\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.643683 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.643703 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-config-data\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.643722 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-log-httpd\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.643828 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-run-httpd\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.644303 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-run-httpd\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.644535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-log-httpd\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.649354 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-config-data\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.652241 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-scripts\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.652300 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.653190 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.665540 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fflfn\" (UniqueName: \"kubernetes.io/projected/14a91b80-f355-4418-869a-35b3080e7695-kube-api-access-fflfn\") pod \"ceilometer-0\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") " pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.745516 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vnxb\" (UniqueName: \"kubernetes.io/projected/6eef84f4-0cf2-4a95-964d-f37667da25da-kube-api-access-2vnxb\") pod \"nova-api-1f68-account-create-d94sr\" (UID: 
\"6eef84f4-0cf2-4a95-964d-f37667da25da\") " pod="openstack/nova-api-1f68-account-create-d94sr" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.818036 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.847447 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vnxb\" (UniqueName: \"kubernetes.io/projected/6eef84f4-0cf2-4a95-964d-f37667da25da-kube-api-access-2vnxb\") pod \"nova-api-1f68-account-create-d94sr\" (UID: \"6eef84f4-0cf2-4a95-964d-f37667da25da\") " pod="openstack/nova-api-1f68-account-create-d94sr" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.871374 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vnxb\" (UniqueName: \"kubernetes.io/projected/6eef84f4-0cf2-4a95-964d-f37667da25da-kube-api-access-2vnxb\") pod \"nova-api-1f68-account-create-d94sr\" (UID: \"6eef84f4-0cf2-4a95-964d-f37667da25da\") " pod="openstack/nova-api-1f68-account-create-d94sr" Oct 01 15:14:53 crc kubenswrapper[4771]: I1001 15:14:53.931936 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1f68-account-create-d94sr" Oct 01 15:14:54 crc kubenswrapper[4771]: I1001 15:14:54.065216 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="501a1350-8a05-4e56-8e04-57cb1f4c721b" path="/var/lib/kubelet/pods/501a1350-8a05-4e56-8e04-57cb1f4c721b/volumes" Oct 01 15:14:54 crc kubenswrapper[4771]: I1001 15:14:54.073447 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f66ea972-6475-4624-a093-8884ead588f8" path="/var/lib/kubelet/pods/f66ea972-6475-4624-a093-8884ead588f8/volumes" Oct 01 15:14:54 crc kubenswrapper[4771]: I1001 15:14:54.315940 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11","Type":"ContainerStarted","Data":"a3bed1f5f229c4579d50a838136173e648e3e35e6372b9c4859af83d0b82b215"} Oct 01 15:14:54 crc kubenswrapper[4771]: I1001 15:14:54.328331 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3a36d28b-706e-4639-9d68-158427aaa655","Type":"ContainerStarted","Data":"2158a3aacc58bd13c2e15802dfe9e2573579b3e18462bed0c38b02698f12afea"} Oct 01 15:14:54 crc kubenswrapper[4771]: I1001 15:14:54.358915 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.070006628 podStartE2EDuration="17.35889586s" podCreationTimestamp="2025-10-01 15:14:37 +0000 UTC" firstStartedPulling="2025-10-01 15:14:38.961530833 +0000 UTC m=+1123.580706004" lastFinishedPulling="2025-10-01 15:14:53.250420065 +0000 UTC m=+1137.869595236" observedRunningTime="2025-10-01 15:14:54.332810939 +0000 UTC m=+1138.951986130" watchObservedRunningTime="2025-10-01 15:14:54.35889586 +0000 UTC m=+1138.978071031" Oct 01 15:14:54 crc kubenswrapper[4771]: I1001 15:14:54.370331 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=9.370309531 
podStartE2EDuration="9.370309531s" podCreationTimestamp="2025-10-01 15:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:54.353468856 +0000 UTC m=+1138.972644037" watchObservedRunningTime="2025-10-01 15:14:54.370309531 +0000 UTC m=+1138.989484702" Oct 01 15:14:54 crc kubenswrapper[4771]: I1001 15:14:54.496909 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:14:54 crc kubenswrapper[4771]: I1001 15:14:54.625781 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-1f68-account-create-d94sr"] Oct 01 15:14:54 crc kubenswrapper[4771]: W1001 15:14:54.635521 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eef84f4_0cf2_4a95_964d_f37667da25da.slice/crio-e5561f85efda2069f39d0e21a22dc6ee50f0d23af1f925e4d164951fcd7ae592 WatchSource:0}: Error finding container e5561f85efda2069f39d0e21a22dc6ee50f0d23af1f925e4d164951fcd7ae592: Status 404 returned error can't find the container with id e5561f85efda2069f39d0e21a22dc6ee50f0d23af1f925e4d164951fcd7ae592 Oct 01 15:14:55 crc kubenswrapper[4771]: I1001 15:14:55.354583 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1f68-account-create-d94sr" event={"ID":"6eef84f4-0cf2-4a95-964d-f37667da25da","Type":"ContainerStarted","Data":"70a1fd9d2c3a6eaff72432f6b177e5779069a3032e30900ecc29c59f70de7c24"} Oct 01 15:14:55 crc kubenswrapper[4771]: I1001 15:14:55.354936 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1f68-account-create-d94sr" event={"ID":"6eef84f4-0cf2-4a95-964d-f37667da25da","Type":"ContainerStarted","Data":"e5561f85efda2069f39d0e21a22dc6ee50f0d23af1f925e4d164951fcd7ae592"} Oct 01 15:14:55 crc kubenswrapper[4771]: I1001 15:14:55.359098 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerID="a90b6e35d0b25d1bee2280f135b1a7b7fbd994bb638560538cf71e22bdc7f88b" exitCode=137 Oct 01 15:14:55 crc kubenswrapper[4771]: I1001 15:14:55.359160 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b48cbbc84-ndtts" event={"ID":"a5159749-76e2-4e89-8a81-59d8ea1ab063","Type":"ContainerDied","Data":"a90b6e35d0b25d1bee2280f135b1a7b7fbd994bb638560538cf71e22bdc7f88b"} Oct 01 15:14:55 crc kubenswrapper[4771]: I1001 15:14:55.360986 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerStarted","Data":"a7c29d14cf9d1256c23bd290dba8fe5674c126ec05501a2e86fadf8084940446"} Oct 01 15:14:55 crc kubenswrapper[4771]: I1001 15:14:55.377150 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-1f68-account-create-d94sr" podStartSLOduration=2.377128495 podStartE2EDuration="2.377128495s" podCreationTimestamp="2025-10-01 15:14:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:14:55.367134228 +0000 UTC m=+1139.986309439" watchObservedRunningTime="2025-10-01 15:14:55.377128495 +0000 UTC m=+1139.996303676" Oct 01 15:14:55 crc kubenswrapper[4771]: I1001 15:14:55.451314 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 01 15:14:55 crc kubenswrapper[4771]: I1001 15:14:55.972778 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.110929 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-secret-key\") pod \"a5159749-76e2-4e89-8a81-59d8ea1ab063\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.110977 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-config-data\") pod \"a5159749-76e2-4e89-8a81-59d8ea1ab063\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.111008 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-tls-certs\") pod \"a5159749-76e2-4e89-8a81-59d8ea1ab063\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.111086 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5159749-76e2-4e89-8a81-59d8ea1ab063-logs\") pod \"a5159749-76e2-4e89-8a81-59d8ea1ab063\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.111121 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-combined-ca-bundle\") pod \"a5159749-76e2-4e89-8a81-59d8ea1ab063\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.111145 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7trtk\" 
(UniqueName: \"kubernetes.io/projected/a5159749-76e2-4e89-8a81-59d8ea1ab063-kube-api-access-7trtk\") pod \"a5159749-76e2-4e89-8a81-59d8ea1ab063\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.111164 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-scripts\") pod \"a5159749-76e2-4e89-8a81-59d8ea1ab063\" (UID: \"a5159749-76e2-4e89-8a81-59d8ea1ab063\") " Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.113375 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5159749-76e2-4e89-8a81-59d8ea1ab063-logs" (OuterVolumeSpecName: "logs") pod "a5159749-76e2-4e89-8a81-59d8ea1ab063" (UID: "a5159749-76e2-4e89-8a81-59d8ea1ab063"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.116104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a5159749-76e2-4e89-8a81-59d8ea1ab063" (UID: "a5159749-76e2-4e89-8a81-59d8ea1ab063"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.121132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5159749-76e2-4e89-8a81-59d8ea1ab063-kube-api-access-7trtk" (OuterVolumeSpecName: "kube-api-access-7trtk") pod "a5159749-76e2-4e89-8a81-59d8ea1ab063" (UID: "a5159749-76e2-4e89-8a81-59d8ea1ab063"). InnerVolumeSpecName "kube-api-access-7trtk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.146369 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-scripts" (OuterVolumeSpecName: "scripts") pod "a5159749-76e2-4e89-8a81-59d8ea1ab063" (UID: "a5159749-76e2-4e89-8a81-59d8ea1ab063"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.157836 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5159749-76e2-4e89-8a81-59d8ea1ab063" (UID: "a5159749-76e2-4e89-8a81-59d8ea1ab063"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.158281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-config-data" (OuterVolumeSpecName: "config-data") pod "a5159749-76e2-4e89-8a81-59d8ea1ab063" (UID: "a5159749-76e2-4e89-8a81-59d8ea1ab063"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.176258 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.182458 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a5159749-76e2-4e89-8a81-59d8ea1ab063" (UID: "a5159749-76e2-4e89-8a81-59d8ea1ab063"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.213898 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.213930 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.213944 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5159749-76e2-4e89-8a81-59d8ea1ab063-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.213956 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.213968 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7trtk\" (UniqueName: \"kubernetes.io/projected/a5159749-76e2-4e89-8a81-59d8ea1ab063-kube-api-access-7trtk\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.213981 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5159749-76e2-4e89-8a81-59d8ea1ab063-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.213991 4771 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a5159749-76e2-4e89-8a81-59d8ea1ab063-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.370560 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerStarted","Data":"be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e"} Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.372257 4771 generic.go:334] "Generic (PLEG): container finished" podID="6eef84f4-0cf2-4a95-964d-f37667da25da" containerID="70a1fd9d2c3a6eaff72432f6b177e5779069a3032e30900ecc29c59f70de7c24" exitCode=0 Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.372299 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1f68-account-create-d94sr" event={"ID":"6eef84f4-0cf2-4a95-964d-f37667da25da","Type":"ContainerDied","Data":"70a1fd9d2c3a6eaff72432f6b177e5779069a3032e30900ecc29c59f70de7c24"} Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.376266 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-b48cbbc84-ndtts" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.376343 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-b48cbbc84-ndtts" event={"ID":"a5159749-76e2-4e89-8a81-59d8ea1ab063","Type":"ContainerDied","Data":"c06305591c15a87728b3c19dea7ad2aaec3af99f9c8d51f5c79833a931704378"} Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.376392 4771 scope.go:117] "RemoveContainer" containerID="2d01502bbf6fc58e63063d52d24bfa7a5cf14caa1fd2964932d63d8a121a9349" Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.422818 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-b48cbbc84-ndtts"] Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.431764 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-b48cbbc84-ndtts"] Oct 01 15:14:56 crc kubenswrapper[4771]: I1001 15:14:56.557351 4771 scope.go:117] "RemoveContainer" containerID="a90b6e35d0b25d1bee2280f135b1a7b7fbd994bb638560538cf71e22bdc7f88b" Oct 01 15:14:57 crc 
kubenswrapper[4771]: I1001 15:14:57.386507 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerStarted","Data":"e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f"} Oct 01 15:14:57 crc kubenswrapper[4771]: I1001 15:14:57.861964 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-1f68-account-create-d94sr" Oct 01 15:14:57 crc kubenswrapper[4771]: I1001 15:14:57.955643 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vnxb\" (UniqueName: \"kubernetes.io/projected/6eef84f4-0cf2-4a95-964d-f37667da25da-kube-api-access-2vnxb\") pod \"6eef84f4-0cf2-4a95-964d-f37667da25da\" (UID: \"6eef84f4-0cf2-4a95-964d-f37667da25da\") " Oct 01 15:14:57 crc kubenswrapper[4771]: I1001 15:14:57.961100 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eef84f4-0cf2-4a95-964d-f37667da25da-kube-api-access-2vnxb" (OuterVolumeSpecName: "kube-api-access-2vnxb") pod "6eef84f4-0cf2-4a95-964d-f37667da25da" (UID: "6eef84f4-0cf2-4a95-964d-f37667da25da"). InnerVolumeSpecName "kube-api-access-2vnxb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:14:58 crc kubenswrapper[4771]: I1001 15:14:58.017931 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" path="/var/lib/kubelet/pods/a5159749-76e2-4e89-8a81-59d8ea1ab063/volumes" Oct 01 15:14:58 crc kubenswrapper[4771]: I1001 15:14:58.059567 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vnxb\" (UniqueName: \"kubernetes.io/projected/6eef84f4-0cf2-4a95-964d-f37667da25da-kube-api-access-2vnxb\") on node \"crc\" DevicePath \"\"" Oct 01 15:14:58 crc kubenswrapper[4771]: I1001 15:14:58.397396 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-1f68-account-create-d94sr" event={"ID":"6eef84f4-0cf2-4a95-964d-f37667da25da","Type":"ContainerDied","Data":"e5561f85efda2069f39d0e21a22dc6ee50f0d23af1f925e4d164951fcd7ae592"} Oct 01 15:14:58 crc kubenswrapper[4771]: I1001 15:14:58.397435 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5561f85efda2069f39d0e21a22dc6ee50f0d23af1f925e4d164951fcd7ae592" Oct 01 15:14:58 crc kubenswrapper[4771]: I1001 15:14:58.397417 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-1f68-account-create-d94sr" Oct 01 15:14:58 crc kubenswrapper[4771]: I1001 15:14:58.400128 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerStarted","Data":"e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6"} Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.130648 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv"] Oct 01 15:15:00 crc kubenswrapper[4771]: E1001 15:15:00.131384 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.131395 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon" Oct 01 15:15:00 crc kubenswrapper[4771]: E1001 15:15:00.131418 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon-log" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.131424 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon-log" Oct 01 15:15:00 crc kubenswrapper[4771]: E1001 15:15:00.131438 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eef84f4-0cf2-4a95-964d-f37667da25da" containerName="mariadb-account-create" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.131444 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eef84f4-0cf2-4a95-964d-f37667da25da" containerName="mariadb-account-create" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.131621 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 
15:15:00.131634 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5159749-76e2-4e89-8a81-59d8ea1ab063" containerName="horizon-log" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.131646 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eef84f4-0cf2-4a95-964d-f37667da25da" containerName="mariadb-account-create" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.132199 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.134148 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.134248 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.151352 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv"] Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.198325 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5v47\" (UniqueName: \"kubernetes.io/projected/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-kube-api-access-x5v47\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.198494 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-secret-volume\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.198636 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-config-volume\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.288879 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.301334 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-secret-volume\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.301476 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-config-volume\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.301526 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5v47\" (UniqueName: \"kubernetes.io/projected/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-kube-api-access-x5v47\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc 
kubenswrapper[4771]: I1001 15:15:00.303295 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-config-volume\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.303966 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-d6484bc47-hkzdw" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.307478 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-secret-volume\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.324357 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5v47\" (UniqueName: \"kubernetes.io/projected/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-kube-api-access-x5v47\") pod \"collect-profiles-29322195-9gkfv\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.453846 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.718918 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 01 15:15:00 crc kubenswrapper[4771]: I1001 15:15:00.903881 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv"] Oct 01 15:15:01 crc kubenswrapper[4771]: I1001 15:15:01.425950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" event={"ID":"49d1d22e-d17a-40b5-b968-c4f1277ae9cb","Type":"ContainerStarted","Data":"d23f1b75738297e3c10440f1c10adb24c572555ac4544f00d7eec0e592acaa26"} Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.452420 4771 generic.go:334] "Generic (PLEG): container finished" podID="49d1d22e-d17a-40b5-b968-c4f1277ae9cb" containerID="0ab99eb779308fb00453a3d5a5ac906b56271e46379cdb79c78d6210c1984a29" exitCode=0 Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.452975 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" event={"ID":"49d1d22e-d17a-40b5-b968-c4f1277ae9cb","Type":"ContainerDied","Data":"0ab99eb779308fb00453a3d5a5ac906b56271e46379cdb79c78d6210c1984a29"} Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.456237 4771 generic.go:334] "Generic (PLEG): container finished" podID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerID="84abb579fa2accc1e6928c8cfcaa03cacfd3aa073a677b804838ee82506ea95c" exitCode=137 Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.456312 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d5b6a127-5e69-4245-bb95-b5aa136a80e5","Type":"ContainerDied","Data":"84abb579fa2accc1e6928c8cfcaa03cacfd3aa073a677b804838ee82506ea95c"} Oct 01 15:15:02 crc 
kubenswrapper[4771]: I1001 15:15:02.474252 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerStarted","Data":"288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1"} Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.474469 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="ceilometer-central-agent" containerID="cri-o://be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e" gracePeriod=30 Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.474494 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.474846 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="sg-core" containerID="cri-o://e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6" gracePeriod=30 Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.474932 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="ceilometer-notification-agent" containerID="cri-o://e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f" gracePeriod=30 Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.474976 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="proxy-httpd" containerID="cri-o://288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1" gracePeriod=30 Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.507192 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=2.026351543 podStartE2EDuration="9.507148337s" podCreationTimestamp="2025-10-01 15:14:53 +0000 UTC" firstStartedPulling="2025-10-01 15:14:54.511151806 +0000 UTC m=+1139.130326977" lastFinishedPulling="2025-10-01 15:15:01.99194858 +0000 UTC m=+1146.611123771" observedRunningTime="2025-10-01 15:15:02.496255158 +0000 UTC m=+1147.115430339" watchObservedRunningTime="2025-10-01 15:15:02.507148337 +0000 UTC m=+1147.126323508" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.640822 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.749019 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b6a127-5e69-4245-bb95-b5aa136a80e5-logs\") pod \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.749480 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data\") pod \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.749579 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-combined-ca-bundle\") pod \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.749637 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data-custom\") pod \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\" (UID: 
\"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.749679 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-scripts\") pod \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.749719 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5b6a127-5e69-4245-bb95-b5aa136a80e5-logs" (OuterVolumeSpecName: "logs") pod "d5b6a127-5e69-4245-bb95-b5aa136a80e5" (UID: "d5b6a127-5e69-4245-bb95-b5aa136a80e5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.749777 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdjpb\" (UniqueName: \"kubernetes.io/projected/d5b6a127-5e69-4245-bb95-b5aa136a80e5-kube-api-access-jdjpb\") pod \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.749824 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5b6a127-5e69-4245-bb95-b5aa136a80e5-etc-machine-id\") pod \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\" (UID: \"d5b6a127-5e69-4245-bb95-b5aa136a80e5\") " Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.750385 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5b6a127-5e69-4245-bb95-b5aa136a80e5-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.750435 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d5b6a127-5e69-4245-bb95-b5aa136a80e5-etc-machine-id" 
(OuterVolumeSpecName: "etc-machine-id") pod "d5b6a127-5e69-4245-bb95-b5aa136a80e5" (UID: "d5b6a127-5e69-4245-bb95-b5aa136a80e5"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.768231 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5b6a127-5e69-4245-bb95-b5aa136a80e5-kube-api-access-jdjpb" (OuterVolumeSpecName: "kube-api-access-jdjpb") pod "d5b6a127-5e69-4245-bb95-b5aa136a80e5" (UID: "d5b6a127-5e69-4245-bb95-b5aa136a80e5"). InnerVolumeSpecName "kube-api-access-jdjpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.769941 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-scripts" (OuterVolumeSpecName: "scripts") pod "d5b6a127-5e69-4245-bb95-b5aa136a80e5" (UID: "d5b6a127-5e69-4245-bb95-b5aa136a80e5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.778626 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d5b6a127-5e69-4245-bb95-b5aa136a80e5" (UID: "d5b6a127-5e69-4245-bb95-b5aa136a80e5"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.801463 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5b6a127-5e69-4245-bb95-b5aa136a80e5" (UID: "d5b6a127-5e69-4245-bb95-b5aa136a80e5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.827643 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data" (OuterVolumeSpecName: "config-data") pod "d5b6a127-5e69-4245-bb95-b5aa136a80e5" (UID: "d5b6a127-5e69-4245-bb95-b5aa136a80e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.853198 4771 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.853236 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.853251 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdjpb\" (UniqueName: \"kubernetes.io/projected/d5b6a127-5e69-4245-bb95-b5aa136a80e5-kube-api-access-jdjpb\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.853266 4771 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d5b6a127-5e69-4245-bb95-b5aa136a80e5-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.853277 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:02 crc kubenswrapper[4771]: I1001 15:15:02.853287 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d5b6a127-5e69-4245-bb95-b5aa136a80e5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.494516 4771 generic.go:334] "Generic (PLEG): container finished" podID="14a91b80-f355-4418-869a-35b3080e7695" containerID="288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1" exitCode=0 Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.494559 4771 generic.go:334] "Generic (PLEG): container finished" podID="14a91b80-f355-4418-869a-35b3080e7695" containerID="e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6" exitCode=2 Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.494572 4771 generic.go:334] "Generic (PLEG): container finished" podID="14a91b80-f355-4418-869a-35b3080e7695" containerID="e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f" exitCode=0 Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.494614 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerDied","Data":"288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1"} Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.494681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerDied","Data":"e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6"} Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.494695 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerDied","Data":"e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f"} Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.499856 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"d5b6a127-5e69-4245-bb95-b5aa136a80e5","Type":"ContainerDied","Data":"d1bc830d5dca8cfc2f5d88569309135ee1c7b0416f31a94d3a65fe07bc60e9fe"} Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.499925 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.499972 4771 scope.go:117] "RemoveContainer" containerID="84abb579fa2accc1e6928c8cfcaa03cacfd3aa073a677b804838ee82506ea95c" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.590786 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.608856 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.596367 4771 scope.go:117] "RemoveContainer" containerID="28e917088bcd60e0fdaf0aa6d0eff59e4c73b02d1d4bd8015e414f95abe783ce" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.634764 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 01 15:15:03 crc kubenswrapper[4771]: E1001 15:15:03.635199 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerName="cinder-api" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.635218 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerName="cinder-api" Oct 01 15:15:03 crc kubenswrapper[4771]: E1001 15:15:03.635235 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerName="cinder-api-log" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.635242 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerName="cinder-api-log" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.635436 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerName="cinder-api-log" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.635452 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" containerName="cinder-api" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.636347 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.642539 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.642743 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.642948 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.652534 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.670774 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.670865 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-scripts\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.670906 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.670955 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12fc4771-6958-4668-adcc-6aa10e36e1ea-logs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.670997 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-config-data\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.671044 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g4t6\" (UniqueName: \"kubernetes.io/projected/12fc4771-6958-4668-adcc-6aa10e36e1ea-kube-api-access-9g4t6\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.671071 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-config-data-custom\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.671096 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.671127 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12fc4771-6958-4668-adcc-6aa10e36e1ea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774471 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-scripts\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774520 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774545 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12fc4771-6958-4668-adcc-6aa10e36e1ea-logs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: 
I1001 15:15:03.774586 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-config-data\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774638 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g4t6\" (UniqueName: \"kubernetes.io/projected/12fc4771-6958-4668-adcc-6aa10e36e1ea-kube-api-access-9g4t6\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774667 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-config-data-custom\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774690 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774719 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/12fc4771-6958-4668-adcc-6aa10e36e1ea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.774848 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/12fc4771-6958-4668-adcc-6aa10e36e1ea-etc-machine-id\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.775798 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/12fc4771-6958-4668-adcc-6aa10e36e1ea-logs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.781234 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.784333 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-config-data-custom\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.784344 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-config-data\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.784422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-public-tls-certs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.787840 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.790161 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12fc4771-6958-4668-adcc-6aa10e36e1ea-scripts\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.795421 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g4t6\" (UniqueName: \"kubernetes.io/projected/12fc4771-6958-4668-adcc-6aa10e36e1ea-kube-api-access-9g4t6\") pod \"cinder-api-0\" (UID: \"12fc4771-6958-4668-adcc-6aa10e36e1ea\") " pod="openstack/cinder-api-0" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.956352 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7184-account-create-s58vj"] Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.958081 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7184-account-create-s58vj" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.960657 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 01 15:15:03 crc kubenswrapper[4771]: I1001 15:15:03.982864 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.020477 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5b6a127-5e69-4245-bb95-b5aa136a80e5" path="/var/lib/kubelet/pods/d5b6a127-5e69-4245-bb95-b5aa136a80e5/volumes"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.021099 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7184-account-create-s58vj"]
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.043536 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.054961 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.057276 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-3995-account-create-bqj22"]
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.057791 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="ceilometer-central-agent"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.057812 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="ceilometer-central-agent"
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.057830 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="sg-core"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.057838 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="sg-core"
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.057857 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="ceilometer-notification-agent"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.057865 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="ceilometer-notification-agent"
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.057883 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d1d22e-d17a-40b5-b968-c4f1277ae9cb" containerName="collect-profiles"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.057890 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d1d22e-d17a-40b5-b968-c4f1277ae9cb" containerName="collect-profiles"
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.057904 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="proxy-httpd"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.057911 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="proxy-httpd"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.058128 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="ceilometer-central-agent"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.058146 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="proxy-httpd"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.058162 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d1d22e-d17a-40b5-b968-c4f1277ae9cb" containerName="collect-profiles"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.058178 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="ceilometer-notification-agent"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.058198 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a91b80-f355-4418-869a-35b3080e7695" containerName="sg-core"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.058942 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3995-account-create-bqj22"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.062608 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.065044 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3995-account-create-bqj22"]
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.086981 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5v47\" (UniqueName: \"kubernetes.io/projected/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-kube-api-access-x5v47\") pod \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.087083 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-log-httpd\") pod \"14a91b80-f355-4418-869a-35b3080e7695\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.087102 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-run-httpd\") pod \"14a91b80-f355-4418-869a-35b3080e7695\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.087863 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fflfn\" (UniqueName: \"kubernetes.io/projected/14a91b80-f355-4418-869a-35b3080e7695-kube-api-access-fflfn\") pod \"14a91b80-f355-4418-869a-35b3080e7695\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.087906 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-config-volume\") pod \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.087957 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-combined-ca-bundle\") pod \"14a91b80-f355-4418-869a-35b3080e7695\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.088026 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-config-data\") pod \"14a91b80-f355-4418-869a-35b3080e7695\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.088068 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-secret-volume\") pod \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\" (UID: \"49d1d22e-d17a-40b5-b968-c4f1277ae9cb\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.088085 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-scripts\") pod \"14a91b80-f355-4418-869a-35b3080e7695\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.088109 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-sg-core-conf-yaml\") pod \"14a91b80-f355-4418-869a-35b3080e7695\" (UID: \"14a91b80-f355-4418-869a-35b3080e7695\") "
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.088337 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-696x2\" (UniqueName: \"kubernetes.io/projected/9aa95246-7e3e-49cc-90df-6d96afa66bdb-kube-api-access-696x2\") pod \"nova-cell0-7184-account-create-s58vj\" (UID: \"9aa95246-7e3e-49cc-90df-6d96afa66bdb\") " pod="openstack/nova-cell0-7184-account-create-s58vj"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.089279 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "14a91b80-f355-4418-869a-35b3080e7695" (UID: "14a91b80-f355-4418-869a-35b3080e7695"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.089506 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "14a91b80-f355-4418-869a-35b3080e7695" (UID: "14a91b80-f355-4418-869a-35b3080e7695"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.090542 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "49d1d22e-d17a-40b5-b968-c4f1277ae9cb" (UID: "49d1d22e-d17a-40b5-b968-c4f1277ae9cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.094157 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-scripts" (OuterVolumeSpecName: "scripts") pod "14a91b80-f355-4418-869a-35b3080e7695" (UID: "14a91b80-f355-4418-869a-35b3080e7695"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.096593 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "49d1d22e-d17a-40b5-b968-c4f1277ae9cb" (UID: "49d1d22e-d17a-40b5-b968-c4f1277ae9cb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.102941 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a91b80-f355-4418-869a-35b3080e7695-kube-api-access-fflfn" (OuterVolumeSpecName: "kube-api-access-fflfn") pod "14a91b80-f355-4418-869a-35b3080e7695" (UID: "14a91b80-f355-4418-869a-35b3080e7695"). InnerVolumeSpecName "kube-api-access-fflfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.107783 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-kube-api-access-x5v47" (OuterVolumeSpecName: "kube-api-access-x5v47") pod "49d1d22e-d17a-40b5-b968-c4f1277ae9cb" (UID: "49d1d22e-d17a-40b5-b968-c4f1277ae9cb"). InnerVolumeSpecName "kube-api-access-x5v47". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.122763 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "14a91b80-f355-4418-869a-35b3080e7695" (UID: "14a91b80-f355-4418-869a-35b3080e7695"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190376 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdwzz\" (UniqueName: \"kubernetes.io/projected/2a7e33d0-9533-4b7f-9bf3-d5b55185f04e-kube-api-access-rdwzz\") pod \"nova-cell1-3995-account-create-bqj22\" (UID: \"2a7e33d0-9533-4b7f-9bf3-d5b55185f04e\") " pod="openstack/nova-cell1-3995-account-create-bqj22"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190770 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-696x2\" (UniqueName: \"kubernetes.io/projected/9aa95246-7e3e-49cc-90df-6d96afa66bdb-kube-api-access-696x2\") pod \"nova-cell0-7184-account-create-s58vj\" (UID: \"9aa95246-7e3e-49cc-90df-6d96afa66bdb\") " pod="openstack/nova-cell0-7184-account-create-s58vj"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190887 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190907 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-scripts\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190920 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190934 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5v47\" (UniqueName: \"kubernetes.io/projected/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-kube-api-access-x5v47\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190945 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-log-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190956 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/14a91b80-f355-4418-869a-35b3080e7695-run-httpd\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190968 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fflfn\" (UniqueName: \"kubernetes.io/projected/14a91b80-f355-4418-869a-35b3080e7695-kube-api-access-fflfn\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.190980 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/49d1d22e-d17a-40b5-b968-c4f1277ae9cb-config-volume\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.205358 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "14a91b80-f355-4418-869a-35b3080e7695" (UID: "14a91b80-f355-4418-869a-35b3080e7695"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.213669 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-696x2\" (UniqueName: \"kubernetes.io/projected/9aa95246-7e3e-49cc-90df-6d96afa66bdb-kube-api-access-696x2\") pod \"nova-cell0-7184-account-create-s58vj\" (UID: \"9aa95246-7e3e-49cc-90df-6d96afa66bdb\") " pod="openstack/nova-cell0-7184-account-create-s58vj"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.243849 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-config-data" (OuterVolumeSpecName: "config-data") pod "14a91b80-f355-4418-869a-35b3080e7695" (UID: "14a91b80-f355-4418-869a-35b3080e7695"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.296007 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwzz\" (UniqueName: \"kubernetes.io/projected/2a7e33d0-9533-4b7f-9bf3-d5b55185f04e-kube-api-access-rdwzz\") pod \"nova-cell1-3995-account-create-bqj22\" (UID: \"2a7e33d0-9533-4b7f-9bf3-d5b55185f04e\") " pod="openstack/nova-cell1-3995-account-create-bqj22"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.296166 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.296184 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/14a91b80-f355-4418-869a-35b3080e7695-config-data\") on node \"crc\" DevicePath \"\""
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.335694 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwzz\" (UniqueName: \"kubernetes.io/projected/2a7e33d0-9533-4b7f-9bf3-d5b55185f04e-kube-api-access-rdwzz\") pod \"nova-cell1-3995-account-create-bqj22\" (UID: \"2a7e33d0-9533-4b7f-9bf3-d5b55185f04e\") " pod="openstack/nova-cell1-3995-account-create-bqj22"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.380916 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7184-account-create-s58vj"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.391145 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3995-account-create-bqj22"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.526762 4771 generic.go:334] "Generic (PLEG): container finished" podID="14a91b80-f355-4418-869a-35b3080e7695" containerID="be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e" exitCode=0
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.526837 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerDied","Data":"be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e"}
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.526885 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"14a91b80-f355-4418-869a-35b3080e7695","Type":"ContainerDied","Data":"a7c29d14cf9d1256c23bd290dba8fe5674c126ec05501a2e86fadf8084940446"}
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.526906 4771 scope.go:117] "RemoveContainer" containerID="288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.527067 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.540318 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv" event={"ID":"49d1d22e-d17a-40b5-b968-c4f1277ae9cb","Type":"ContainerDied","Data":"d23f1b75738297e3c10440f1c10adb24c572555ac4544f00d7eec0e592acaa26"}
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.540351 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23f1b75738297e3c10440f1c10adb24c572555ac4544f00d7eec0e592acaa26"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.540400 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.550477 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.782946 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.795425 4771 scope.go:117] "RemoveContainer" containerID="e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.811748 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.820164 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.822796 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.826103 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.826229 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.827219 4771 scope.go:117] "RemoveContainer" containerID="e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.829932 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.879393 4771 scope.go:117] "RemoveContainer" containerID="be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.913191 4771 scope.go:117] "RemoveContainer" containerID="288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1"
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.915552 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1\": container with ID starting with 288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1 not found: ID does not exist" containerID="288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.915615 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1"} err="failed to get container status \"288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1\": rpc error: code = NotFound desc = could not find container \"288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1\": container with ID starting with 288a21b1c678f6d1e6bb63a9b7de8e022df54a1ece0bdf0e32d69ad3e267a8e1 not found: ID does not exist"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.915660 4771 scope.go:117] "RemoveContainer" containerID="e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6"
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.917631 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6\": container with ID starting with e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6 not found: ID does not exist" containerID="e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.917700 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6"} err="failed to get container status \"e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6\": rpc error: code = NotFound desc = could not find container \"e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6\": container with ID starting with e899abcac415be6d3374df8a11ae22aa5e33d79c8e99f3b621012a4ea53498f6 not found: ID does not exist"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.917762 4771 scope.go:117] "RemoveContainer" containerID="e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f"
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.919012 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f\": container with ID starting with e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f not found: ID does not exist" containerID="e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919077 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f"} err="failed to get container status \"e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f\": rpc error: code = NotFound desc = could not find container \"e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f\": container with ID starting with e0e39c3b7861ee1e97ad51566034d1cabeb1285bb506c8f020f3d12f0dd8ba2f not found: ID does not exist"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919099 4771 scope.go:117] "RemoveContainer" containerID="be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919490 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-scripts\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919554 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsvk9\" (UniqueName: \"kubernetes.io/projected/657585fd-f66c-49a8-b076-6b75dbb595e0-kube-api-access-jsvk9\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919577 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919598 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919633 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-log-httpd\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919699 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-config-data\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.919797 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-run-httpd\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:04 crc kubenswrapper[4771]: E1001 15:15:04.921201 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e\": container with ID starting with be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e not found: ID does not exist" containerID="be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.921240 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e"} err="failed to get container status \"be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e\": rpc error: code = NotFound desc = could not find container \"be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e\": container with ID starting with be2c19f64fe57c6c719685b5199a9694ee0e0521d0e74e8f634d6a6437d4291e not found: ID does not exist"
Oct 01 15:15:04 crc kubenswrapper[4771]: I1001 15:15:04.955314 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-3995-account-create-bqj22"]
Oct 01 15:15:04 crc kubenswrapper[4771]: W1001 15:15:04.962041 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a7e33d0_9533_4b7f_9bf3_d5b55185f04e.slice/crio-57ad4ed512c019a31f7118fd706f9ae94a11c6eaa805fc50ca235cf27d34fdac WatchSource:0}: Error finding container 57ad4ed512c019a31f7118fd706f9ae94a11c6eaa805fc50ca235cf27d34fdac: Status 404 returned error can't find the container with id 57ad4ed512c019a31f7118fd706f9ae94a11c6eaa805fc50ca235cf27d34fdac
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.021398 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-config-data\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.022045 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-run-httpd\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.022188 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-scripts\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.022267 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsvk9\" (UniqueName: \"kubernetes.io/projected/657585fd-f66c-49a8-b076-6b75dbb595e0-kube-api-access-jsvk9\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.022309 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.022359 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.022457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-log-httpd\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.022618 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-run-httpd\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.023343 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-log-httpd\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.027479 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-scripts\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.030016 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.030636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.032617 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-config-data\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.040773 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsvk9\" (UniqueName: \"kubernetes.io/projected/657585fd-f66c-49a8-b076-6b75dbb595e0-kube-api-access-jsvk9\") pod \"ceilometer-0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.110510 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7184-account-create-s58vj"]
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.145509 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.558546 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"12fc4771-6958-4668-adcc-6aa10e36e1ea","Type":"ContainerStarted","Data":"7297b3bf37833d3e25599dfc555c53e8d5d21045b425485574d38fd98bab294b"}
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.559090 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"12fc4771-6958-4668-adcc-6aa10e36e1ea","Type":"ContainerStarted","Data":"d48643853735833b27246cdc3503a505e6dfec5c13e48dc7234d7526b12e0439"}
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.583639 4771 generic.go:334] "Generic (PLEG): container finished" podID="2a7e33d0-9533-4b7f-9bf3-d5b55185f04e" containerID="97995242163b40a7a899fdc97b76b359f163a71d228c41ee82476bdbe4f7f219" exitCode=0
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.583781 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3995-account-create-bqj22" event={"ID":"2a7e33d0-9533-4b7f-9bf3-d5b55185f04e","Type":"ContainerDied","Data":"97995242163b40a7a899fdc97b76b359f163a71d228c41ee82476bdbe4f7f219"}
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.583854 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3995-account-create-bqj22" event={"ID":"2a7e33d0-9533-4b7f-9bf3-d5b55185f04e","Type":"ContainerStarted","Data":"57ad4ed512c019a31f7118fd706f9ae94a11c6eaa805fc50ca235cf27d34fdac"}
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.594655 4771 generic.go:334] "Generic (PLEG): container finished" podID="9aa95246-7e3e-49cc-90df-6d96afa66bdb" containerID="1a5f5f9a60df4d9edeecf2be5ab3c48d1643e112880c8b38f1a80d0a3738b41d" exitCode=0
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.594961 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7184-account-create-s58vj" event={"ID":"9aa95246-7e3e-49cc-90df-6d96afa66bdb","Type":"ContainerDied","Data":"1a5f5f9a60df4d9edeecf2be5ab3c48d1643e112880c8b38f1a80d0a3738b41d"}
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.595196 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7184-account-create-s58vj" event={"ID":"9aa95246-7e3e-49cc-90df-6d96afa66bdb","Type":"ContainerStarted","Data":"afb6b75b2e9d881155bf2489943848c275af82feb8defb55fe0d21bf4fa0447f"}
Oct 01 15:15:05 crc kubenswrapper[4771]: I1001 15:15:05.620688 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Oct 01 15:15:06 crc kubenswrapper[4771]: I1001 15:15:06.006531 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a91b80-f355-4418-869a-35b3080e7695" path="/var/lib/kubelet/pods/14a91b80-f355-4418-869a-35b3080e7695/volumes"
Oct 01 15:15:06 crc kubenswrapper[4771]: I1001 15:15:06.607320 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"12fc4771-6958-4668-adcc-6aa10e36e1ea","Type":"ContainerStarted","Data":"50af7d9b36eae619d1c466c8d6b3b5e5ed18204e71c12261c6d9b2962d4ae316"}
Oct 01 15:15:06 crc kubenswrapper[4771]: I1001 15:15:06.609002 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Oct 01 15:15:06 crc kubenswrapper[4771]: I1001 15:15:06.613292 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerStarted","Data":"e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d"}
Oct 01 15:15:06 crc kubenswrapper[4771]: I1001 15:15:06.613357 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerStarted","Data":"27ff0efc5b8367d7fd06446aa243fbbd35cdb98b0252b01e6d204c16f016e1b7"}
Oct 01 15:15:06 crc kubenswrapper[4771]: I1001 15:15:06.636000 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.635985201 podStartE2EDuration="3.635985201s" podCreationTimestamp="2025-10-01 15:15:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:06.630522296 +0000 UTC m=+1151.249697467" watchObservedRunningTime="2025-10-01 15:15:06.635985201 +0000 UTC m=+1151.255160372"
Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.066810 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-3995-account-create-bqj22"
Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.066961 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7184-account-create-s58vj" Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.166797 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdwzz\" (UniqueName: \"kubernetes.io/projected/2a7e33d0-9533-4b7f-9bf3-d5b55185f04e-kube-api-access-rdwzz\") pod \"2a7e33d0-9533-4b7f-9bf3-d5b55185f04e\" (UID: \"2a7e33d0-9533-4b7f-9bf3-d5b55185f04e\") " Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.166984 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-696x2\" (UniqueName: \"kubernetes.io/projected/9aa95246-7e3e-49cc-90df-6d96afa66bdb-kube-api-access-696x2\") pod \"9aa95246-7e3e-49cc-90df-6d96afa66bdb\" (UID: \"9aa95246-7e3e-49cc-90df-6d96afa66bdb\") " Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.171973 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aa95246-7e3e-49cc-90df-6d96afa66bdb-kube-api-access-696x2" (OuterVolumeSpecName: "kube-api-access-696x2") pod "9aa95246-7e3e-49cc-90df-6d96afa66bdb" (UID: "9aa95246-7e3e-49cc-90df-6d96afa66bdb"). InnerVolumeSpecName "kube-api-access-696x2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.172807 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7e33d0-9533-4b7f-9bf3-d5b55185f04e-kube-api-access-rdwzz" (OuterVolumeSpecName: "kube-api-access-rdwzz") pod "2a7e33d0-9533-4b7f-9bf3-d5b55185f04e" (UID: "2a7e33d0-9533-4b7f-9bf3-d5b55185f04e"). InnerVolumeSpecName "kube-api-access-rdwzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.269061 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-696x2\" (UniqueName: \"kubernetes.io/projected/9aa95246-7e3e-49cc-90df-6d96afa66bdb-kube-api-access-696x2\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.269090 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdwzz\" (UniqueName: \"kubernetes.io/projected/2a7e33d0-9533-4b7f-9bf3-d5b55185f04e-kube-api-access-rdwzz\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.629108 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7184-account-create-s58vj" event={"ID":"9aa95246-7e3e-49cc-90df-6d96afa66bdb","Type":"ContainerDied","Data":"afb6b75b2e9d881155bf2489943848c275af82feb8defb55fe0d21bf4fa0447f"} Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.629154 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="afb6b75b2e9d881155bf2489943848c275af82feb8defb55fe0d21bf4fa0447f" Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.629112 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7184-account-create-s58vj" Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.632203 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerStarted","Data":"512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d"} Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.634165 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-3995-account-create-bqj22" Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.634239 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-3995-account-create-bqj22" event={"ID":"2a7e33d0-9533-4b7f-9bf3-d5b55185f04e","Type":"ContainerDied","Data":"57ad4ed512c019a31f7118fd706f9ae94a11c6eaa805fc50ca235cf27d34fdac"} Oct 01 15:15:07 crc kubenswrapper[4771]: I1001 15:15:07.634274 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57ad4ed512c019a31f7118fd706f9ae94a11c6eaa805fc50ca235cf27d34fdac" Oct 01 15:15:08 crc kubenswrapper[4771]: I1001 15:15:08.646124 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerStarted","Data":"97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb"} Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.208266 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.208484 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-log" containerID="cri-o://11a86c2d50fbf4102cdb5f5fbef715c02373931b76a24fd501536adb701cd374" gracePeriod=30 Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.208909 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-httpd" containerID="cri-o://8df1a31fa585058e4faf5751ffdbf997240f309b7cb136fb0ddc8a85c0d1b7a0" gracePeriod=30 Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.226643 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" 
podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": EOF" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.227258 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/glance-default-external-api-0" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": EOF" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.227398 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.166:9292/healthcheck\": EOF" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.302854 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xfmf"] Oct 01 15:15:09 crc kubenswrapper[4771]: E1001 15:15:09.303343 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aa95246-7e3e-49cc-90df-6d96afa66bdb" containerName="mariadb-account-create" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.303367 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aa95246-7e3e-49cc-90df-6d96afa66bdb" containerName="mariadb-account-create" Oct 01 15:15:09 crc kubenswrapper[4771]: E1001 15:15:09.303404 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7e33d0-9533-4b7f-9bf3-d5b55185f04e" containerName="mariadb-account-create" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.303412 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7e33d0-9533-4b7f-9bf3-d5b55185f04e" containerName="mariadb-account-create" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.303647 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7e33d0-9533-4b7f-9bf3-d5b55185f04e" containerName="mariadb-account-create" 
Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.303675 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aa95246-7e3e-49cc-90df-6d96afa66bdb" containerName="mariadb-account-create" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.304401 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.306151 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.307554 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5mnmv" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.307760 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.325164 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xfmf"] Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.408065 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljvdj\" (UniqueName: \"kubernetes.io/projected/95f37934-8f6e-4013-b7a4-5563e5245c79-kube-api-access-ljvdj\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.408124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-config-data\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.408187 
4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-scripts\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.408242 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.509905 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.510068 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljvdj\" (UniqueName: \"kubernetes.io/projected/95f37934-8f6e-4013-b7a4-5563e5245c79-kube-api-access-ljvdj\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.510094 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-config-data\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc 
kubenswrapper[4771]: I1001 15:15:09.510135 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-scripts\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.515317 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.516303 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-config-data\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.521161 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-scripts\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.546089 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljvdj\" (UniqueName: \"kubernetes.io/projected/95f37934-8f6e-4013-b7a4-5563e5245c79-kube-api-access-ljvdj\") pod \"nova-cell0-conductor-db-sync-9xfmf\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.596975 4771 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.621428 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.660865 4771 generic.go:334] "Generic (PLEG): container finished" podID="f267338d-cce8-4abe-b836-519fcca98eed" containerID="11a86c2d50fbf4102cdb5f5fbef715c02373931b76a24fd501536adb701cd374" exitCode=143 Oct 01 15:15:09 crc kubenswrapper[4771]: I1001 15:15:09.660918 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f267338d-cce8-4abe-b836-519fcca98eed","Type":"ContainerDied","Data":"11a86c2d50fbf4102cdb5f5fbef715c02373931b76a24fd501536adb701cd374"} Oct 01 15:15:10 crc kubenswrapper[4771]: W1001 15:15:10.082871 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95f37934_8f6e_4013_b7a4_5563e5245c79.slice/crio-433e3d0f762d387080d794e7f2f801f8f3f1a92c6288a83eadb30b7db5ea85ae WatchSource:0}: Error finding container 433e3d0f762d387080d794e7f2f801f8f3f1a92c6288a83eadb30b7db5ea85ae: Status 404 returned error can't find the container with id 433e3d0f762d387080d794e7f2f801f8f3f1a92c6288a83eadb30b7db5ea85ae Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.083252 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xfmf"] Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.670724 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" event={"ID":"95f37934-8f6e-4013-b7a4-5563e5245c79","Type":"ContainerStarted","Data":"433e3d0f762d387080d794e7f2f801f8f3f1a92c6288a83eadb30b7db5ea85ae"} Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.673226 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerStarted","Data":"9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644"} Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.673376 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="ceilometer-central-agent" containerID="cri-o://e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d" gracePeriod=30 Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.673435 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.673443 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="ceilometer-notification-agent" containerID="cri-o://512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d" gracePeriod=30 Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.673461 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="sg-core" containerID="cri-o://97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb" gracePeriod=30 Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.673797 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="proxy-httpd" containerID="cri-o://9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644" gracePeriod=30 Oct 01 15:15:10 crc kubenswrapper[4771]: I1001 15:15:10.710866 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.98774667 podStartE2EDuration="6.710818926s" 
podCreationTimestamp="2025-10-01 15:15:04 +0000 UTC" firstStartedPulling="2025-10-01 15:15:05.635871432 +0000 UTC m=+1150.255046603" lastFinishedPulling="2025-10-01 15:15:10.358943698 +0000 UTC m=+1154.978118859" observedRunningTime="2025-10-01 15:15:10.702095322 +0000 UTC m=+1155.321270523" watchObservedRunningTime="2025-10-01 15:15:10.710818926 +0000 UTC m=+1155.329994097" Oct 01 15:15:11 crc kubenswrapper[4771]: I1001 15:15:11.695194 4771 generic.go:334] "Generic (PLEG): container finished" podID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerID="97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb" exitCode=2 Oct 01 15:15:11 crc kubenswrapper[4771]: I1001 15:15:11.695252 4771 generic.go:334] "Generic (PLEG): container finished" podID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerID="512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d" exitCode=0 Oct 01 15:15:11 crc kubenswrapper[4771]: I1001 15:15:11.695277 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerDied","Data":"97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb"} Oct 01 15:15:11 crc kubenswrapper[4771]: I1001 15:15:11.695324 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerDied","Data":"512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d"} Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.177218 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.177581 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.177628 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.178646 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a4ed1f9c5d09bf489c874da2478bf24ba55dbfc7c07deabe55036c8bafeb8e52"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.178718 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://a4ed1f9c5d09bf489c874da2478bf24ba55dbfc7c07deabe55036c8bafeb8e52" gracePeriod=600 Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.709438 4771 generic.go:334] "Generic (PLEG): container finished" podID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerID="e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d" exitCode=0 Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.709586 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerDied","Data":"e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d"} Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.716781 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="a4ed1f9c5d09bf489c874da2478bf24ba55dbfc7c07deabe55036c8bafeb8e52" exitCode=0 Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.716843 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"a4ed1f9c5d09bf489c874da2478bf24ba55dbfc7c07deabe55036c8bafeb8e52"} Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.716876 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"6fb5c90115d7e5e2b881eca81e834dd62c83b4059c941e9b801dafa27eac271e"} Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.716912 4771 scope.go:117] "RemoveContainer" containerID="a954616bb5027e4b658bf522da064c2dd70331be4152d83f2506f267347e29d3" Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.898172 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.898453 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerName="glance-log" containerID="cri-o://686e91eb3fb6a5d0be094dd11ce59a79a83eeddbf9e9c9a3a86df4a6f4c25305" gracePeriod=30 Oct 01 15:15:12 crc kubenswrapper[4771]: I1001 15:15:12.898939 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerName="glance-httpd" containerID="cri-o://11ffecfc6813991424be80103fa3cc6c4c21baf81e0ca539d2bebb199a4b3356" gracePeriod=30 Oct 01 15:15:13 crc kubenswrapper[4771]: I1001 15:15:13.732654 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerID="686e91eb3fb6a5d0be094dd11ce59a79a83eeddbf9e9c9a3a86df4a6f4c25305" exitCode=143 Oct 01 15:15:13 crc kubenswrapper[4771]: I1001 15:15:13.733913 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e1a8666-1d85-403b-88ac-8ee7417faba9","Type":"ContainerDied","Data":"686e91eb3fb6a5d0be094dd11ce59a79a83eeddbf9e9c9a3a86df4a6f4c25305"} Oct 01 15:15:14 crc kubenswrapper[4771]: I1001 15:15:14.747162 4771 generic.go:334] "Generic (PLEG): container finished" podID="f267338d-cce8-4abe-b836-519fcca98eed" containerID="8df1a31fa585058e4faf5751ffdbf997240f309b7cb136fb0ddc8a85c0d1b7a0" exitCode=0 Oct 01 15:15:14 crc kubenswrapper[4771]: I1001 15:15:14.747423 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"f267338d-cce8-4abe-b836-519fcca98eed","Type":"ContainerDied","Data":"8df1a31fa585058e4faf5751ffdbf997240f309b7cb136fb0ddc8a85c0d1b7a0"} Oct 01 15:15:15 crc kubenswrapper[4771]: I1001 15:15:15.883682 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 01 15:15:16 crc kubenswrapper[4771]: I1001 15:15:16.770425 4771 generic.go:334] "Generic (PLEG): container finished" podID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerID="11ffecfc6813991424be80103fa3cc6c4c21baf81e0ca539d2bebb199a4b3356" exitCode=0 Oct 01 15:15:16 crc kubenswrapper[4771]: I1001 15:15:16.770709 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e1a8666-1d85-403b-88ac-8ee7417faba9","Type":"ContainerDied","Data":"11ffecfc6813991424be80103fa3cc6c4c21baf81e0ca539d2bebb199a4b3356"} Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.067061 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073468 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-httpd-run\") pod \"f267338d-cce8-4abe-b836-519fcca98eed\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073537 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-scripts\") pod \"f267338d-cce8-4abe-b836-519fcca98eed\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073575 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkmxd\" (UniqueName: \"kubernetes.io/projected/f267338d-cce8-4abe-b836-519fcca98eed-kube-api-access-hkmxd\") pod \"f267338d-cce8-4abe-b836-519fcca98eed\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073604 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-combined-ca-bundle\") pod \"f267338d-cce8-4abe-b836-519fcca98eed\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073743 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-public-tls-certs\") pod \"f267338d-cce8-4abe-b836-519fcca98eed\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073803 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-config-data\") pod \"f267338d-cce8-4abe-b836-519fcca98eed\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073852 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"f267338d-cce8-4abe-b836-519fcca98eed\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073884 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-logs\") pod \"f267338d-cce8-4abe-b836-519fcca98eed\" (UID: \"f267338d-cce8-4abe-b836-519fcca98eed\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.073992 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f267338d-cce8-4abe-b836-519fcca98eed" (UID: "f267338d-cce8-4abe-b836-519fcca98eed"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.074567 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.074605 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-logs" (OuterVolumeSpecName: "logs") pod "f267338d-cce8-4abe-b836-519fcca98eed" (UID: "f267338d-cce8-4abe-b836-519fcca98eed"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.080848 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-scripts" (OuterVolumeSpecName: "scripts") pod "f267338d-cce8-4abe-b836-519fcca98eed" (UID: "f267338d-cce8-4abe-b836-519fcca98eed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.081388 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f267338d-cce8-4abe-b836-519fcca98eed-kube-api-access-hkmxd" (OuterVolumeSpecName: "kube-api-access-hkmxd") pod "f267338d-cce8-4abe-b836-519fcca98eed" (UID: "f267338d-cce8-4abe-b836-519fcca98eed"). InnerVolumeSpecName "kube-api-access-hkmxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.083353 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "f267338d-cce8-4abe-b836-519fcca98eed" (UID: "f267338d-cce8-4abe-b836-519fcca98eed"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.123083 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f267338d-cce8-4abe-b836-519fcca98eed" (UID: "f267338d-cce8-4abe-b836-519fcca98eed"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.146241 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-config-data" (OuterVolumeSpecName: "config-data") pod "f267338d-cce8-4abe-b836-519fcca98eed" (UID: "f267338d-cce8-4abe-b836-519fcca98eed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.157468 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f267338d-cce8-4abe-b836-519fcca98eed" (UID: "f267338d-cce8-4abe-b836-519fcca98eed"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.175949 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.175966 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.175996 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.176005 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f267338d-cce8-4abe-b836-519fcca98eed-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 
15:15:18.176013 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.176022 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkmxd\" (UniqueName: \"kubernetes.io/projected/f267338d-cce8-4abe-b836-519fcca98eed-kube-api-access-hkmxd\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.176031 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f267338d-cce8-4abe-b836-519fcca98eed-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.200719 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.273976 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.277594 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.378812 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz64z\" (UniqueName: \"kubernetes.io/projected/6e1a8666-1d85-403b-88ac-8ee7417faba9-kube-api-access-fz64z\") pod \"6e1a8666-1d85-403b-88ac-8ee7417faba9\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.378890 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-config-data\") pod \"6e1a8666-1d85-403b-88ac-8ee7417faba9\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.378913 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-internal-tls-certs\") pod \"6e1a8666-1d85-403b-88ac-8ee7417faba9\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.378976 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-scripts\") pod \"6e1a8666-1d85-403b-88ac-8ee7417faba9\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.379014 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"6e1a8666-1d85-403b-88ac-8ee7417faba9\" (UID: 
\"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.379037 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-httpd-run\") pod \"6e1a8666-1d85-403b-88ac-8ee7417faba9\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.379073 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-combined-ca-bundle\") pod \"6e1a8666-1d85-403b-88ac-8ee7417faba9\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.379150 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-logs\") pod \"6e1a8666-1d85-403b-88ac-8ee7417faba9\" (UID: \"6e1a8666-1d85-403b-88ac-8ee7417faba9\") " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.379677 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6e1a8666-1d85-403b-88ac-8ee7417faba9" (UID: "6e1a8666-1d85-403b-88ac-8ee7417faba9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.380139 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-logs" (OuterVolumeSpecName: "logs") pod "6e1a8666-1d85-403b-88ac-8ee7417faba9" (UID: "6e1a8666-1d85-403b-88ac-8ee7417faba9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.383911 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "6e1a8666-1d85-403b-88ac-8ee7417faba9" (UID: "6e1a8666-1d85-403b-88ac-8ee7417faba9"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.385406 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-scripts" (OuterVolumeSpecName: "scripts") pod "6e1a8666-1d85-403b-88ac-8ee7417faba9" (UID: "6e1a8666-1d85-403b-88ac-8ee7417faba9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.385891 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e1a8666-1d85-403b-88ac-8ee7417faba9-kube-api-access-fz64z" (OuterVolumeSpecName: "kube-api-access-fz64z") pod "6e1a8666-1d85-403b-88ac-8ee7417faba9" (UID: "6e1a8666-1d85-403b-88ac-8ee7417faba9"). InnerVolumeSpecName "kube-api-access-fz64z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.418600 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e1a8666-1d85-403b-88ac-8ee7417faba9" (UID: "6e1a8666-1d85-403b-88ac-8ee7417faba9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.429456 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-config-data" (OuterVolumeSpecName: "config-data") pod "6e1a8666-1d85-403b-88ac-8ee7417faba9" (UID: "6e1a8666-1d85-403b-88ac-8ee7417faba9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.434886 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6e1a8666-1d85-403b-88ac-8ee7417faba9" (UID: "6e1a8666-1d85-403b-88ac-8ee7417faba9"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.482020 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.482053 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz64z\" (UniqueName: \"kubernetes.io/projected/6e1a8666-1d85-403b-88ac-8ee7417faba9-kube-api-access-fz64z\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.482065 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.482076 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 
15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.482085 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.482117 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.482126 4771 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6e1a8666-1d85-403b-88ac-8ee7417faba9-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.482134 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e1a8666-1d85-403b-88ac-8ee7417faba9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.508877 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.588842 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.794439 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" event={"ID":"95f37934-8f6e-4013-b7a4-5563e5245c79","Type":"ContainerStarted","Data":"a6e3a0e105d23e6398c3b0c2daff658ed59e266b5c5977dfce20cf632cb6d692"} Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.797068 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"f267338d-cce8-4abe-b836-519fcca98eed","Type":"ContainerDied","Data":"28c9a7fe73f401cda6b2dbd5c92fa8e3d96e6b393d763fe9ae04f87bc0bb94c8"} Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.797394 4771 scope.go:117] "RemoveContainer" containerID="8df1a31fa585058e4faf5751ffdbf997240f309b7cb136fb0ddc8a85c0d1b7a0" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.797270 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.801145 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6e1a8666-1d85-403b-88ac-8ee7417faba9","Type":"ContainerDied","Data":"ade83319257f6b31faf018cada3ce391a4094fd0f817184736d785090453fef6"} Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.801253 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.831437 4771 scope.go:117] "RemoveContainer" containerID="11a86c2d50fbf4102cdb5f5fbef715c02373931b76a24fd501536adb701cd374" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.838079 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" podStartSLOduration=1.9766041159999999 podStartE2EDuration="9.838060025s" podCreationTimestamp="2025-10-01 15:15:09 +0000 UTC" firstStartedPulling="2025-10-01 15:15:10.085369896 +0000 UTC m=+1154.704545057" lastFinishedPulling="2025-10-01 15:15:17.946825765 +0000 UTC m=+1162.566000966" observedRunningTime="2025-10-01 15:15:18.81264504 +0000 UTC m=+1163.431820221" watchObservedRunningTime="2025-10-01 15:15:18.838060025 +0000 UTC m=+1163.457235196" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.858268 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.868601 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.870627 4771 scope.go:117] "RemoveContainer" containerID="11ffecfc6813991424be80103fa3cc6c4c21baf81e0ca539d2bebb199a4b3356" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.886667 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.901144 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.943969 4771 scope.go:117] "RemoveContainer" containerID="686e91eb3fb6a5d0be094dd11ce59a79a83eeddbf9e9c9a3a86df4a6f4c25305" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.944136 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:15:18 crc kubenswrapper[4771]: E1001 15:15:18.944572 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerName="glance-log" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.944596 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerName="glance-log" Oct 01 15:15:18 crc kubenswrapper[4771]: E1001 15:15:18.944619 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-httpd" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.944628 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-httpd" Oct 01 15:15:18 crc kubenswrapper[4771]: E1001 15:15:18.944651 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-log" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.944658 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-log" Oct 01 15:15:18 crc kubenswrapper[4771]: E1001 15:15:18.944684 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerName="glance-httpd" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.944692 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerName="glance-httpd" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.945144 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-log" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.945164 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerName="glance-httpd" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.945184 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f267338d-cce8-4abe-b836-519fcca98eed" containerName="glance-httpd" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.945200 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" containerName="glance-log" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.946389 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.951529 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.951843 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.952080 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.952267 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-hfxfb" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.969555 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.971603 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.980099 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.980502 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 01 15:15:18 crc kubenswrapper[4771]: I1001 15:15:18.991556 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.005867 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.103439 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.104459 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd39907f-ab26-47d6-9d78-0b0437de4b04-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.104612 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd39907f-ab26-47d6-9d78-0b0437de4b04-logs\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 
15:15:19.104749 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g464c\" (UniqueName: \"kubernetes.io/projected/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-kube-api-access-g464c\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.104864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.104980 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.105081 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-logs\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.105185 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 
01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.105307 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.105434 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.105557 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwh5d\" (UniqueName: \"kubernetes.io/projected/bd39907f-ab26-47d6-9d78-0b0437de4b04-kube-api-access-dwh5d\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.105670 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.105888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " 
pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.106026 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.106167 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.106283 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.207528 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.208475 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc 
kubenswrapper[4771]: I1001 15:15:19.208590 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.208682 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.208768 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd39907f-ab26-47d6-9d78-0b0437de4b04-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.208867 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd39907f-ab26-47d6-9d78-0b0437de4b04-logs\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.208972 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g464c\" (UniqueName: \"kubernetes.io/projected/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-kube-api-access-g464c\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209052 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209127 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-logs\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209420 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209509 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwh5d\" (UniqueName: \"kubernetes.io/projected/bd39907f-ab26-47d6-9d78-0b0437de4b04-kube-api-access-dwh5d\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209577 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.209648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.213989 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd39907f-ab26-47d6-9d78-0b0437de4b04-logs\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.214251 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/bd39907f-ab26-47d6-9d78-0b0437de4b04-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.214533 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-logs\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.214864 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.215086 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-scripts\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.215124 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.215377 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: 
\"bd39907f-ab26-47d6-9d78-0b0437de4b04\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.232244 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.236313 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.237596 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.238157 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-config-data\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.239812 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " 
pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.240333 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.246540 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd39907f-ab26-47d6-9d78-0b0437de4b04-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.262662 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g464c\" (UniqueName: \"kubernetes.io/projected/83dc7d05-3ef5-4da2-b4ea-58d3c11d4528-kube-api-access-g464c\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.277117 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwh5d\" (UniqueName: \"kubernetes.io/projected/bd39907f-ab26-47d6-9d78-0b0437de4b04-kube-api-access-dwh5d\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.294245 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528\") " pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc 
kubenswrapper[4771]: I1001 15:15:19.339921 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"bd39907f-ab26-47d6-9d78-0b0437de4b04\") " pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.571797 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.597667 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.994717 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e1a8666-1d85-403b-88ac-8ee7417faba9" path="/var/lib/kubelet/pods/6e1a8666-1d85-403b-88ac-8ee7417faba9/volumes" Oct 01 15:15:19 crc kubenswrapper[4771]: I1001 15:15:19.995481 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f267338d-cce8-4abe-b836-519fcca98eed" path="/var/lib/kubelet/pods/f267338d-cce8-4abe-b836-519fcca98eed/volumes" Oct 01 15:15:20 crc kubenswrapper[4771]: I1001 15:15:20.114016 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 01 15:15:20 crc kubenswrapper[4771]: W1001 15:15:20.114109 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83dc7d05_3ef5_4da2_b4ea_58d3c11d4528.slice/crio-277000b9ae7d9fe4b41a406860d1f1b516223b41dca0ae60b196ba8249fecbbe WatchSource:0}: Error finding container 277000b9ae7d9fe4b41a406860d1f1b516223b41dca0ae60b196ba8249fecbbe: Status 404 returned error can't find the container with id 277000b9ae7d9fe4b41a406860d1f1b516223b41dca0ae60b196ba8249fecbbe Oct 01 15:15:20 crc kubenswrapper[4771]: I1001 15:15:20.244679 4771 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 01 15:15:20 crc kubenswrapper[4771]: W1001 15:15:20.255140 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd39907f_ab26_47d6_9d78_0b0437de4b04.slice/crio-a609d2f92380c4f1ac5f0bd0c92c0aca271f904ccf88be1fab65cfc20bbd2623 WatchSource:0}: Error finding container a609d2f92380c4f1ac5f0bd0c92c0aca271f904ccf88be1fab65cfc20bbd2623: Status 404 returned error can't find the container with id a609d2f92380c4f1ac5f0bd0c92c0aca271f904ccf88be1fab65cfc20bbd2623 Oct 01 15:15:20 crc kubenswrapper[4771]: I1001 15:15:20.822910 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528","Type":"ContainerStarted","Data":"2ae12469d25f58438534851205ef00213535ecb1185a538aca671e33ef8224c5"} Oct 01 15:15:20 crc kubenswrapper[4771]: I1001 15:15:20.822950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528","Type":"ContainerStarted","Data":"277000b9ae7d9fe4b41a406860d1f1b516223b41dca0ae60b196ba8249fecbbe"} Oct 01 15:15:20 crc kubenswrapper[4771]: I1001 15:15:20.824962 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd39907f-ab26-47d6-9d78-0b0437de4b04","Type":"ContainerStarted","Data":"a609d2f92380c4f1ac5f0bd0c92c0aca271f904ccf88be1fab65cfc20bbd2623"} Oct 01 15:15:21 crc kubenswrapper[4771]: I1001 15:15:21.836352 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd39907f-ab26-47d6-9d78-0b0437de4b04","Type":"ContainerStarted","Data":"eb739d851d57869457b0781954e5527eb47d4d64e82e2b0a46b960274dc07ae1"} Oct 01 15:15:21 crc kubenswrapper[4771]: I1001 15:15:21.836688 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd39907f-ab26-47d6-9d78-0b0437de4b04","Type":"ContainerStarted","Data":"b67fc98d1e0d243dbb11d5c1df77f27d05469fe49cbc73f6b61dae387b2c6b39"} Oct 01 15:15:21 crc kubenswrapper[4771]: I1001 15:15:21.839496 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"83dc7d05-3ef5-4da2-b4ea-58d3c11d4528","Type":"ContainerStarted","Data":"dbe92e730bcfdff5385998dbc49dc67034c81d42dffa1fbba32d9aee4c4fda74"} Oct 01 15:15:21 crc kubenswrapper[4771]: I1001 15:15:21.856317 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.856297071 podStartE2EDuration="3.856297071s" podCreationTimestamp="2025-10-01 15:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:21.852651902 +0000 UTC m=+1166.471827073" watchObservedRunningTime="2025-10-01 15:15:21.856297071 +0000 UTC m=+1166.475472232" Oct 01 15:15:21 crc kubenswrapper[4771]: I1001 15:15:21.881482 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.88145701 podStartE2EDuration="3.88145701s" podCreationTimestamp="2025-10-01 15:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:21.876083448 +0000 UTC m=+1166.495258629" watchObservedRunningTime="2025-10-01 15:15:21.88145701 +0000 UTC m=+1166.500632201" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.572564 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.573306 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/glance-default-internal-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.598094 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.598138 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.609953 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.652791 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.655600 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.656955 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.935531 4771 generic.go:334] "Generic (PLEG): container finished" podID="95f37934-8f6e-4013-b7a4-5563e5245c79" containerID="a6e3a0e105d23e6398c3b0c2daff658ed59e266b5c5977dfce20cf632cb6d692" exitCode=0 Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.935607 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" event={"ID":"95f37934-8f6e-4013-b7a4-5563e5245c79","Type":"ContainerDied","Data":"a6e3a0e105d23e6398c3b0c2daff658ed59e266b5c5977dfce20cf632cb6d692"} Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.936451 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.936557 4771 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.936751 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:29 crc kubenswrapper[4771]: I1001 15:15:29.937708 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.335532 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.446531 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-scripts\") pod \"95f37934-8f6e-4013-b7a4-5563e5245c79\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.446674 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-config-data\") pod \"95f37934-8f6e-4013-b7a4-5563e5245c79\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.447566 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljvdj\" (UniqueName: \"kubernetes.io/projected/95f37934-8f6e-4013-b7a4-5563e5245c79-kube-api-access-ljvdj\") pod \"95f37934-8f6e-4013-b7a4-5563e5245c79\" (UID: \"95f37934-8f6e-4013-b7a4-5563e5245c79\") " Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.447599 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-combined-ca-bundle\") pod \"95f37934-8f6e-4013-b7a4-5563e5245c79\" (UID: 
\"95f37934-8f6e-4013-b7a4-5563e5245c79\") " Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.452257 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-scripts" (OuterVolumeSpecName: "scripts") pod "95f37934-8f6e-4013-b7a4-5563e5245c79" (UID: "95f37934-8f6e-4013-b7a4-5563e5245c79"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.453005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95f37934-8f6e-4013-b7a4-5563e5245c79-kube-api-access-ljvdj" (OuterVolumeSpecName: "kube-api-access-ljvdj") pod "95f37934-8f6e-4013-b7a4-5563e5245c79" (UID: "95f37934-8f6e-4013-b7a4-5563e5245c79"). InnerVolumeSpecName "kube-api-access-ljvdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.481135 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-config-data" (OuterVolumeSpecName: "config-data") pod "95f37934-8f6e-4013-b7a4-5563e5245c79" (UID: "95f37934-8f6e-4013-b7a4-5563e5245c79"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.481375 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95f37934-8f6e-4013-b7a4-5563e5245c79" (UID: "95f37934-8f6e-4013-b7a4-5563e5245c79"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.550014 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.550051 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljvdj\" (UniqueName: \"kubernetes.io/projected/95f37934-8f6e-4013-b7a4-5563e5245c79-kube-api-access-ljvdj\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.550063 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.550072 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95f37934-8f6e-4013-b7a4-5563e5245c79-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.862364 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.865688 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.991494 4771 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 01 15:15:31 crc kubenswrapper[4771]: I1001 15:15:31.992526 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.026944 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.026990 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9xfmf" event={"ID":"95f37934-8f6e-4013-b7a4-5563e5245c79","Type":"ContainerDied","Data":"433e3d0f762d387080d794e7f2f801f8f3f1a92c6288a83eadb30b7db5ea85ae"} Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.027021 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="433e3d0f762d387080d794e7f2f801f8f3f1a92c6288a83eadb30b7db5ea85ae" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.070484 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.148359 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 15:15:32 crc kubenswrapper[4771]: E1001 15:15:32.148799 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95f37934-8f6e-4013-b7a4-5563e5245c79" containerName="nova-cell0-conductor-db-sync" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.148819 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f37934-8f6e-4013-b7a4-5563e5245c79" containerName="nova-cell0-conductor-db-sync" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.149040 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="95f37934-8f6e-4013-b7a4-5563e5245c79" containerName="nova-cell0-conductor-db-sync" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.150340 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.154248 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.154445 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-5mnmv" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.164211 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.284315 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01636b1-c705-4844-94b8-bb58e65faa1f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.284394 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a01636b1-c705-4844-94b8-bb58e65faa1f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.284457 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmzv\" (UniqueName: \"kubernetes.io/projected/a01636b1-c705-4844-94b8-bb58e65faa1f-kube-api-access-4vmzv\") pod \"nova-cell0-conductor-0\" (UID: \"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.386353 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a01636b1-c705-4844-94b8-bb58e65faa1f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.386438 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a01636b1-c705-4844-94b8-bb58e65faa1f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.386524 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmzv\" (UniqueName: \"kubernetes.io/projected/a01636b1-c705-4844-94b8-bb58e65faa1f-kube-api-access-4vmzv\") pod \"nova-cell0-conductor-0\" (UID: \"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.392520 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a01636b1-c705-4844-94b8-bb58e65faa1f-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.392650 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a01636b1-c705-4844-94b8-bb58e65faa1f-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.411344 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmzv\" (UniqueName: \"kubernetes.io/projected/a01636b1-c705-4844-94b8-bb58e65faa1f-kube-api-access-4vmzv\") pod \"nova-cell0-conductor-0\" (UID: 
\"a01636b1-c705-4844-94b8-bb58e65faa1f\") " pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:32 crc kubenswrapper[4771]: I1001 15:15:32.474532 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:33 crc kubenswrapper[4771]: I1001 15:15:33.269211 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 01 15:15:34 crc kubenswrapper[4771]: I1001 15:15:34.014148 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a01636b1-c705-4844-94b8-bb58e65faa1f","Type":"ContainerStarted","Data":"31eeb9bf004d37f33dc2c8995811af4aaaa3dfa4a334ef4c8245043fdd43df2a"} Oct 01 15:15:34 crc kubenswrapper[4771]: I1001 15:15:34.015453 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a01636b1-c705-4844-94b8-bb58e65faa1f","Type":"ContainerStarted","Data":"8f5c17d8d88e525fdb918f8b7e4d559afb4a2b29e43bc49748105f980cf781f2"} Oct 01 15:15:34 crc kubenswrapper[4771]: I1001 15:15:34.041301 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.041284795 podStartE2EDuration="2.041284795s" podCreationTimestamp="2025-10-01 15:15:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:34.032281844 +0000 UTC m=+1178.651457015" watchObservedRunningTime="2025-10-01 15:15:34.041284795 +0000 UTC m=+1178.660459966" Oct 01 15:15:35 crc kubenswrapper[4771]: I1001 15:15:35.029029 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:35 crc kubenswrapper[4771]: I1001 15:15:35.152136 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="proxy-httpd" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.098962 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.100864 4771 generic.go:334] "Generic (PLEG): container finished" podID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerID="9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644" exitCode=137 Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.100915 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerDied","Data":"9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644"} Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.100957 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"657585fd-f66c-49a8-b076-6b75dbb595e0","Type":"ContainerDied","Data":"27ff0efc5b8367d7fd06446aa243fbbd35cdb98b0252b01e6d204c16f016e1b7"} Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.100977 4771 scope.go:117] "RemoveContainer" containerID="9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.138628 4771 scope.go:117] "RemoveContainer" containerID="97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.161205 4771 scope.go:117] "RemoveContainer" containerID="512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.172900 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-run-httpd\") pod \"657585fd-f66c-49a8-b076-6b75dbb595e0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " Oct 01 15:15:41 crc 
kubenswrapper[4771]: I1001 15:15:41.173011 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-scripts\") pod \"657585fd-f66c-49a8-b076-6b75dbb595e0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.173118 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsvk9\" (UniqueName: \"kubernetes.io/projected/657585fd-f66c-49a8-b076-6b75dbb595e0-kube-api-access-jsvk9\") pod \"657585fd-f66c-49a8-b076-6b75dbb595e0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.173150 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-sg-core-conf-yaml\") pod \"657585fd-f66c-49a8-b076-6b75dbb595e0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.173177 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-log-httpd\") pod \"657585fd-f66c-49a8-b076-6b75dbb595e0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.173198 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-config-data\") pod \"657585fd-f66c-49a8-b076-6b75dbb595e0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.173266 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-combined-ca-bundle\") pod 
\"657585fd-f66c-49a8-b076-6b75dbb595e0\" (UID: \"657585fd-f66c-49a8-b076-6b75dbb595e0\") " Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.173718 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "657585fd-f66c-49a8-b076-6b75dbb595e0" (UID: "657585fd-f66c-49a8-b076-6b75dbb595e0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.173853 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.174247 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "657585fd-f66c-49a8-b076-6b75dbb595e0" (UID: "657585fd-f66c-49a8-b076-6b75dbb595e0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.179062 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-scripts" (OuterVolumeSpecName: "scripts") pod "657585fd-f66c-49a8-b076-6b75dbb595e0" (UID: "657585fd-f66c-49a8-b076-6b75dbb595e0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.179266 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657585fd-f66c-49a8-b076-6b75dbb595e0-kube-api-access-jsvk9" (OuterVolumeSpecName: "kube-api-access-jsvk9") pod "657585fd-f66c-49a8-b076-6b75dbb595e0" (UID: "657585fd-f66c-49a8-b076-6b75dbb595e0"). 
InnerVolumeSpecName "kube-api-access-jsvk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.183681 4771 scope.go:117] "RemoveContainer" containerID="e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.209764 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "657585fd-f66c-49a8-b076-6b75dbb595e0" (UID: "657585fd-f66c-49a8-b076-6b75dbb595e0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.244234 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "657585fd-f66c-49a8-b076-6b75dbb595e0" (UID: "657585fd-f66c-49a8-b076-6b75dbb595e0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.276228 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.276468 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsvk9\" (UniqueName: \"kubernetes.io/projected/657585fd-f66c-49a8-b076-6b75dbb595e0-kube-api-access-jsvk9\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.276556 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.276633 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/657585fd-f66c-49a8-b076-6b75dbb595e0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.276713 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.280716 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-config-data" (OuterVolumeSpecName: "config-data") pod "657585fd-f66c-49a8-b076-6b75dbb595e0" (UID: "657585fd-f66c-49a8-b076-6b75dbb595e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.297779 4771 scope.go:117] "RemoveContainer" containerID="9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644" Oct 01 15:15:41 crc kubenswrapper[4771]: E1001 15:15:41.298360 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644\": container with ID starting with 9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644 not found: ID does not exist" containerID="9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.298420 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644"} err="failed to get container status \"9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644\": rpc error: code = NotFound desc = could not find container \"9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644\": container with ID starting with 9d69d5e519616e59a82b57158c316bfc849a80c1a281da81efb1edc40f90a644 not found: ID does not exist" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.298454 4771 scope.go:117] "RemoveContainer" containerID="97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb" Oct 01 15:15:41 crc kubenswrapper[4771]: E1001 15:15:41.299053 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb\": container with ID starting with 97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb not found: ID does not exist" containerID="97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.299083 
4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb"} err="failed to get container status \"97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb\": rpc error: code = NotFound desc = could not find container \"97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb\": container with ID starting with 97dc2b438d66d8c915161d33091b4c29d2e1a8daed279fc8afe1b9051f2ed1bb not found: ID does not exist" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.299103 4771 scope.go:117] "RemoveContainer" containerID="512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d" Oct 01 15:15:41 crc kubenswrapper[4771]: E1001 15:15:41.299470 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d\": container with ID starting with 512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d not found: ID does not exist" containerID="512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.299498 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d"} err="failed to get container status \"512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d\": rpc error: code = NotFound desc = could not find container \"512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d\": container with ID starting with 512e1f6e7dcbb61af79b428a232574a25754417c45d05250c9ce0fc335c8cf7d not found: ID does not exist" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.299515 4771 scope.go:117] "RemoveContainer" containerID="e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d" Oct 01 15:15:41 crc kubenswrapper[4771]: E1001 
15:15:41.300072 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d\": container with ID starting with e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d not found: ID does not exist" containerID="e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.300099 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d"} err="failed to get container status \"e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d\": rpc error: code = NotFound desc = could not find container \"e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d\": container with ID starting with e7883bc02c877e3b0faadbde73d24ad745ef3a5d4d200eff172ff50991003a3d not found: ID does not exist" Oct 01 15:15:41 crc kubenswrapper[4771]: I1001 15:15:41.378934 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/657585fd-f66c-49a8-b076-6b75dbb595e0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.112808 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.136927 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.144703 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.176218 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:15:42 crc kubenswrapper[4771]: E1001 15:15:42.176687 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="sg-core" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.176716 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="sg-core" Oct 01 15:15:42 crc kubenswrapper[4771]: E1001 15:15:42.176764 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="ceilometer-central-agent" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.176778 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="ceilometer-central-agent" Oct 01 15:15:42 crc kubenswrapper[4771]: E1001 15:15:42.176795 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="ceilometer-notification-agent" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.176805 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="ceilometer-notification-agent" Oct 01 15:15:42 crc kubenswrapper[4771]: E1001 15:15:42.176823 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="proxy-httpd" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.176830 4771 
state_mem.go:107] "Deleted CPUSet assignment" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="proxy-httpd" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.177125 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="ceilometer-central-agent" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.177161 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="ceilometer-notification-agent" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.177178 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="proxy-httpd" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.177203 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" containerName="sg-core" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.181979 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.184367 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.185620 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.191002 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.306925 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-run-httpd\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.306978 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-config-data\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.307014 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.307042 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " 
pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.307448 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-log-httpd\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.307803 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5b95\" (UniqueName: \"kubernetes.io/projected/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-kube-api-access-l5b95\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.307909 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-scripts\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.409584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-scripts\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.409720 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-run-httpd\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.409778 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-config-data\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.409804 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.409824 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.409887 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-log-httpd\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.409945 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5b95\" (UniqueName: \"kubernetes.io/projected/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-kube-api-access-l5b95\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.410903 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-run-httpd\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc 
kubenswrapper[4771]: I1001 15:15:42.410955 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-log-httpd\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.414698 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-scripts\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.426499 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.426841 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.427214 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-config-data\") pod \"ceilometer-0\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.432145 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5b95\" (UniqueName: \"kubernetes.io/projected/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-kube-api-access-l5b95\") pod \"ceilometer-0\" (UID: 
\"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.508695 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:15:42 crc kubenswrapper[4771]: I1001 15:15:42.531897 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.005168 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: W1001 15:15:43.016648 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf10e4587_5897_4af2_ae7f_5b4e3a8392d2.slice/crio-cae200a7dddeafe5f0049eae651d42530627b7b7546f359abe45a88e4b63655f WatchSource:0}: Error finding container cae200a7dddeafe5f0049eae651d42530627b7b7546f359abe45a88e4b63655f: Status 404 returned error can't find the container with id cae200a7dddeafe5f0049eae651d42530627b7b7546f359abe45a88e4b63655f Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.048877 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-hqpb2"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.050100 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.053037 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.053311 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.095816 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hqpb2"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.131221 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerStarted","Data":"cae200a7dddeafe5f0049eae651d42530627b7b7546f359abe45a88e4b63655f"} Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.131579 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.131650 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-scripts\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.131689 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds4df\" (UniqueName: \"kubernetes.io/projected/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-kube-api-access-ds4df\") pod 
\"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.132556 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-config-data\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.233814 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds4df\" (UniqueName: \"kubernetes.io/projected/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-kube-api-access-ds4df\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.234352 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-config-data\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.234450 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.234513 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-scripts\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: 
\"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.246305 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.246322 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-scripts\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.250576 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-config-data\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.263281 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds4df\" (UniqueName: \"kubernetes.io/projected/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-kube-api-access-ds4df\") pod \"nova-cell0-cell-mapping-hqpb2\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.310803 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.312275 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.315147 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.328478 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.338405 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.340058 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.353601 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.368106 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.425138 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.438894 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89ad9e9-e14e-4082-9121-39ae3de01ab3-logs\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.438980 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfmjt\" (UniqueName: \"kubernetes.io/projected/d89ad9e9-e14e-4082-9121-39ae3de01ab3-kube-api-access-kfmjt\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.439040 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj45z\" (UniqueName: \"kubernetes.io/projected/08a4e4bf-4009-4d69-b513-f821239be25c-kube-api-access-jj45z\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.439085 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-config-data\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.439163 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-config-data\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 
15:15:43.439193 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.439228 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.446713 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.464613 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.476094 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545079 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-config-data\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545162 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-config-data\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545191 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545221 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgckk\" (UniqueName: \"kubernetes.io/projected/4bfe5278-08a7-4ece-aaea-bc4e123f028d-kube-api-access-xgckk\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545257 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545305 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89ad9e9-e14e-4082-9121-39ae3de01ab3-logs\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfmjt\" (UniqueName: \"kubernetes.io/projected/d89ad9e9-e14e-4082-9121-39ae3de01ab3-kube-api-access-kfmjt\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545364 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-config-data\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545386 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545404 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfe5278-08a7-4ece-aaea-bc4e123f028d-logs\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.545429 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj45z\" (UniqueName: \"kubernetes.io/projected/08a4e4bf-4009-4d69-b513-f821239be25c-kube-api-access-jj45z\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.549259 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89ad9e9-e14e-4082-9121-39ae3de01ab3-logs\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.567299 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 
crc kubenswrapper[4771]: I1001 15:15:43.610647 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-config-data\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.611378 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.621070 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfmjt\" (UniqueName: \"kubernetes.io/projected/d89ad9e9-e14e-4082-9121-39ae3de01ab3-kube-api-access-kfmjt\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.623441 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-config-data\") pod \"nova-api-0\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.645379 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj45z\" (UniqueName: \"kubernetes.io/projected/08a4e4bf-4009-4d69-b513-f821239be25c-kube-api-access-jj45z\") pod \"nova-scheduler-0\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.652969 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgckk\" (UniqueName: 
\"kubernetes.io/projected/4bfe5278-08a7-4ece-aaea-bc4e123f028d-kube-api-access-xgckk\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.653148 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-config-data\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.653185 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.653214 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfe5278-08a7-4ece-aaea-bc4e123f028d-logs\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.653656 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfe5278-08a7-4ece-aaea-bc4e123f028d-logs\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.656051 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.660256 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-config-data\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.661177 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.679863 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.695352 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgckk\" (UniqueName: \"kubernetes.io/projected/4bfe5278-08a7-4ece-aaea-bc4e123f028d-kube-api-access-xgckk\") pod \"nova-metadata-0\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.702826 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.704103 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.707310 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.721812 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.754118 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.755809 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rssbb\" (UniqueName: \"kubernetes.io/projected/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-kube-api-access-rssbb\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.755865 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.755918 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.780001 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4wgt5"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.782227 4771 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.814656 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4wgt5"] Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.861672 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-config\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.861754 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.861797 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.861853 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.861901 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-2b8d7\" (UniqueName: \"kubernetes.io/projected/b3a0a923-de02-41f9-81ea-79f1529d12e4-kube-api-access-2b8d7\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.861937 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.862011 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rssbb\" (UniqueName: \"kubernetes.io/projected/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-kube-api-access-rssbb\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.862039 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.862057 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.866716 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.874322 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.901196 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rssbb\" (UniqueName: \"kubernetes.io/projected/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-kube-api-access-rssbb\") pod \"nova-cell1-novncproxy-0\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.929350 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.964749 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-config\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.965113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.965947 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-svc\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.965150 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.966024 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b8d7\" (UniqueName: \"kubernetes.io/projected/b3a0a923-de02-41f9-81ea-79f1529d12e4-kube-api-access-2b8d7\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc 
kubenswrapper[4771]: I1001 15:15:43.966049 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.966111 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.966162 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-nb\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.966894 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-sb\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.967216 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-swift-storage-0\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.967602 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-config\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:43 crc kubenswrapper[4771]: I1001 15:15:43.986254 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b8d7\" (UniqueName: \"kubernetes.io/projected/b3a0a923-de02-41f9-81ea-79f1529d12e4-kube-api-access-2b8d7\") pod \"dnsmasq-dns-845d6d6f59-4wgt5\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.000866 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657585fd-f66c-49a8-b076-6b75dbb595e0" path="/var/lib/kubelet/pods/657585fd-f66c-49a8-b076-6b75dbb595e0/volumes" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.039159 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.101432 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.155266 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-hqpb2"] Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.167268 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerStarted","Data":"23ae913d6ac1cbd4c2707c2b990cdcad7e906aa212dfc4b3e288e4caa4aa4c4f"} Oct 01 15:15:44 crc kubenswrapper[4771]: W1001 15:15:44.182927 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97e141b7_bc02_4fbe_b918_6b31a4dea6cf.slice/crio-8bd0480560c28893089b1e58ae180269ae22bebe47f3328d57c9c123ad8491bd WatchSource:0}: Error finding container 8bd0480560c28893089b1e58ae180269ae22bebe47f3328d57c9c123ad8491bd: Status 404 returned error can't find the container with id 8bd0480560c28893089b1e58ae180269ae22bebe47f3328d57c9c123ad8491bd Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.278461 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.360887 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.399413 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.669452 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4wgt5"] Oct 01 15:15:44 crc kubenswrapper[4771]: W1001 15:15:44.674649 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3a0a923_de02_41f9_81ea_79f1529d12e4.slice/crio-a6e3a2fda0e7ca42978b5f6defc8d67ddeedb2b0afee56e1c5226d4b96b57e89 
WatchSource:0}: Error finding container a6e3a2fda0e7ca42978b5f6defc8d67ddeedb2b0afee56e1c5226d4b96b57e89: Status 404 returned error can't find the container with id a6e3a2fda0e7ca42978b5f6defc8d67ddeedb2b0afee56e1c5226d4b96b57e89 Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.677108 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:15:44 crc kubenswrapper[4771]: W1001 15:15:44.678483 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d74cde4_a358_4f81_9c52_7e4a9a1646f1.slice/crio-39a5591c0b485e0269423a74303088d27a49e7a81b4932284c0750cc75e1f560 WatchSource:0}: Error finding container 39a5591c0b485e0269423a74303088d27a49e7a81b4932284c0750cc75e1f560: Status 404 returned error can't find the container with id 39a5591c0b485e0269423a74303088d27a49e7a81b4932284c0750cc75e1f560 Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.684486 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fb68r"] Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.685613 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.689229 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.694847 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.697464 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fb68r"] Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.784581 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.784673 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-config-data\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.784714 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tpvl\" (UniqueName: \"kubernetes.io/projected/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-kube-api-access-8tpvl\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.784800 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-scripts\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.887104 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-scripts\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.887193 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.887276 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-config-data\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.887327 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tpvl\" (UniqueName: \"kubernetes.io/projected/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-kube-api-access-8tpvl\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.891301 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.891545 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-config-data\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.891692 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-scripts\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:44 crc kubenswrapper[4771]: I1001 15:15:44.911664 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tpvl\" (UniqueName: \"kubernetes.io/projected/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-kube-api-access-8tpvl\") pod \"nova-cell1-conductor-db-sync-fb68r\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.019317 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.201964 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerStarted","Data":"daa8b6d6c018fc866fc424dc0af864a8d02a7b70ed46211915e7f4a3716f8be0"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.211392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7d74cde4-a358-4f81-9c52-7e4a9a1646f1","Type":"ContainerStarted","Data":"39a5591c0b485e0269423a74303088d27a49e7a81b4932284c0750cc75e1f560"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.215714 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d89ad9e9-e14e-4082-9121-39ae3de01ab3","Type":"ContainerStarted","Data":"d59b25de00fdaf17c470f1bd1e781302aa1b147721ea86c44081000e324fcbee"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.219924 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bfe5278-08a7-4ece-aaea-bc4e123f028d","Type":"ContainerStarted","Data":"29d16b258c13ade7831fbf638c983be836b06e4fda3f4ba217ed08800a936216"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.224045 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"08a4e4bf-4009-4d69-b513-f821239be25c","Type":"ContainerStarted","Data":"cf8294e01a66578a2f8cd637bac8a9ba270c6e7dfe115b0ff163b9e32828a145"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.225885 4771 generic.go:334] "Generic (PLEG): container finished" podID="b3a0a923-de02-41f9-81ea-79f1529d12e4" containerID="1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042" exitCode=0 Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.225949 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" event={"ID":"b3a0a923-de02-41f9-81ea-79f1529d12e4","Type":"ContainerDied","Data":"1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.225968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" event={"ID":"b3a0a923-de02-41f9-81ea-79f1529d12e4","Type":"ContainerStarted","Data":"a6e3a2fda0e7ca42978b5f6defc8d67ddeedb2b0afee56e1c5226d4b96b57e89"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.229182 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hqpb2" event={"ID":"97e141b7-bc02-4fbe-b918-6b31a4dea6cf","Type":"ContainerStarted","Data":"7ac0a44bd285907f9bc19f2bc2ec9d571c6507531c623cdf49edb9ca1b72fc57"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.229228 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hqpb2" event={"ID":"97e141b7-bc02-4fbe-b918-6b31a4dea6cf","Type":"ContainerStarted","Data":"8bd0480560c28893089b1e58ae180269ae22bebe47f3328d57c9c123ad8491bd"} Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.309098 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-hqpb2" podStartSLOduration=2.309077832 podStartE2EDuration="2.309077832s" podCreationTimestamp="2025-10-01 15:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:45.293300163 +0000 UTC m=+1189.912475334" watchObservedRunningTime="2025-10-01 15:15:45.309077832 +0000 UTC m=+1189.928253003" Oct 01 15:15:45 crc kubenswrapper[4771]: I1001 15:15:45.555678 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fb68r"] Oct 01 15:15:46 crc kubenswrapper[4771]: I1001 15:15:46.239836 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-fb68r" event={"ID":"abc660b6-f8c7-4a39-b5f1-861ecaa73e75","Type":"ContainerStarted","Data":"2bda283390b2179b92ccb2c1cdb4baeb170ec2009c6bf7a5860510b7057457d6"} Oct 01 15:15:46 crc kubenswrapper[4771]: I1001 15:15:46.240159 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fb68r" event={"ID":"abc660b6-f8c7-4a39-b5f1-861ecaa73e75","Type":"ContainerStarted","Data":"c0c8ca476a05553249d06440eb6633beb2579b8efdc655007d751a9399adabdb"} Oct 01 15:15:46 crc kubenswrapper[4771]: I1001 15:15:46.242904 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" event={"ID":"b3a0a923-de02-41f9-81ea-79f1529d12e4","Type":"ContainerStarted","Data":"a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e"} Oct 01 15:15:46 crc kubenswrapper[4771]: I1001 15:15:46.243041 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:46 crc kubenswrapper[4771]: I1001 15:15:46.245258 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerStarted","Data":"d81312d3000de177a8e959024c071a0140f63f31b8ffbb4f615251640832a58f"} Oct 01 15:15:46 crc kubenswrapper[4771]: I1001 15:15:46.272110 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-fb68r" podStartSLOduration=2.272090067 podStartE2EDuration="2.272090067s" podCreationTimestamp="2025-10-01 15:15:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:46.257887547 +0000 UTC m=+1190.877062718" watchObservedRunningTime="2025-10-01 15:15:46.272090067 +0000 UTC m=+1190.891265248" Oct 01 15:15:46 crc kubenswrapper[4771]: I1001 15:15:46.277801 4771 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" podStartSLOduration=3.277779598 podStartE2EDuration="3.277779598s" podCreationTimestamp="2025-10-01 15:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:46.275179543 +0000 UTC m=+1190.894354724" watchObservedRunningTime="2025-10-01 15:15:46.277779598 +0000 UTC m=+1190.896954779" Oct 01 15:15:47 crc kubenswrapper[4771]: I1001 15:15:47.013636 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:47 crc kubenswrapper[4771]: I1001 15:15:47.023965 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.275273 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7d74cde4-a358-4f81-9c52-7e4a9a1646f1","Type":"ContainerStarted","Data":"7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951"} Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.276289 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7d74cde4-a358-4f81-9c52-7e4a9a1646f1" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951" gracePeriod=30 Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.284932 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d89ad9e9-e14e-4082-9121-39ae3de01ab3","Type":"ContainerStarted","Data":"e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470"} Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.284978 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"d89ad9e9-e14e-4082-9121-39ae3de01ab3","Type":"ContainerStarted","Data":"b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c"} Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.288773 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bfe5278-08a7-4ece-aaea-bc4e123f028d","Type":"ContainerStarted","Data":"521bdcfb3f0193028c6446bb8fddd27bbac8ca17f47c38897bedb817033c17c4"} Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.288810 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bfe5278-08a7-4ece-aaea-bc4e123f028d","Type":"ContainerStarted","Data":"fa57cb76bd6f8c0c37a8efb084a35029ebee5e8e822464bf394b1b9b4d945d28"} Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.288928 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerName="nova-metadata-log" containerID="cri-o://fa57cb76bd6f8c0c37a8efb084a35029ebee5e8e822464bf394b1b9b4d945d28" gracePeriod=30 Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.289208 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerName="nova-metadata-metadata" containerID="cri-o://521bdcfb3f0193028c6446bb8fddd27bbac8ca17f47c38897bedb817033c17c4" gracePeriod=30 Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.292939 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"08a4e4bf-4009-4d69-b513-f821239be25c","Type":"ContainerStarted","Data":"6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b"} Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.302609 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.983576628 
podStartE2EDuration="6.302583455s" podCreationTimestamp="2025-10-01 15:15:43 +0000 UTC" firstStartedPulling="2025-10-01 15:15:44.687017065 +0000 UTC m=+1189.306192246" lastFinishedPulling="2025-10-01 15:15:48.006023912 +0000 UTC m=+1192.625199073" observedRunningTime="2025-10-01 15:15:49.298578937 +0000 UTC m=+1193.917754108" watchObservedRunningTime="2025-10-01 15:15:49.302583455 +0000 UTC m=+1193.921758666" Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.318820 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerStarted","Data":"743a4a90e2a8d1bba869c28314e97739392e0e2b236c4521ab60d98951ff670d"} Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.319173 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.329640 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.759448042 podStartE2EDuration="6.32961811s" podCreationTimestamp="2025-10-01 15:15:43 +0000 UTC" firstStartedPulling="2025-10-01 15:15:44.421717437 +0000 UTC m=+1189.040892608" lastFinishedPulling="2025-10-01 15:15:47.991887505 +0000 UTC m=+1192.611062676" observedRunningTime="2025-10-01 15:15:49.320671281 +0000 UTC m=+1193.939846462" watchObservedRunningTime="2025-10-01 15:15:49.32961811 +0000 UTC m=+1193.948793291" Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.355581 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.755501796 podStartE2EDuration="6.355562269s" podCreationTimestamp="2025-10-01 15:15:43 +0000 UTC" firstStartedPulling="2025-10-01 15:15:44.391930574 +0000 UTC m=+1189.011105745" lastFinishedPulling="2025-10-01 15:15:47.991991047 +0000 UTC m=+1192.611166218" observedRunningTime="2025-10-01 15:15:49.344457426 +0000 UTC 
m=+1193.963632627" watchObservedRunningTime="2025-10-01 15:15:49.355562269 +0000 UTC m=+1193.974737440" Oct 01 15:15:49 crc kubenswrapper[4771]: I1001 15:15:49.401094 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.736094419 podStartE2EDuration="6.401076469s" podCreationTimestamp="2025-10-01 15:15:43 +0000 UTC" firstStartedPulling="2025-10-01 15:15:44.316269773 +0000 UTC m=+1188.935444944" lastFinishedPulling="2025-10-01 15:15:47.981251803 +0000 UTC m=+1192.600426994" observedRunningTime="2025-10-01 15:15:49.372887455 +0000 UTC m=+1193.992062646" watchObservedRunningTime="2025-10-01 15:15:49.401076469 +0000 UTC m=+1194.020251640" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.330812 4771 generic.go:334] "Generic (PLEG): container finished" podID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerID="521bdcfb3f0193028c6446bb8fddd27bbac8ca17f47c38897bedb817033c17c4" exitCode=0 Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.331085 4771 generic.go:334] "Generic (PLEG): container finished" podID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerID="fa57cb76bd6f8c0c37a8efb084a35029ebee5e8e822464bf394b1b9b4d945d28" exitCode=143 Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.330884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bfe5278-08a7-4ece-aaea-bc4e123f028d","Type":"ContainerDied","Data":"521bdcfb3f0193028c6446bb8fddd27bbac8ca17f47c38897bedb817033c17c4"} Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.331224 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4bfe5278-08a7-4ece-aaea-bc4e123f028d","Type":"ContainerDied","Data":"fa57cb76bd6f8c0c37a8efb084a35029ebee5e8e822464bf394b1b9b4d945d28"} Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.331250 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"4bfe5278-08a7-4ece-aaea-bc4e123f028d","Type":"ContainerDied","Data":"29d16b258c13ade7831fbf638c983be836b06e4fda3f4ba217ed08800a936216"} Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.331263 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="29d16b258c13ade7831fbf638c983be836b06e4fda3f4ba217ed08800a936216" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.335120 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.354049 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.363214222 podStartE2EDuration="8.354028067s" podCreationTimestamp="2025-10-01 15:15:42 +0000 UTC" firstStartedPulling="2025-10-01 15:15:43.020121398 +0000 UTC m=+1187.639296569" lastFinishedPulling="2025-10-01 15:15:48.010935243 +0000 UTC m=+1192.630110414" observedRunningTime="2025-10-01 15:15:49.398887765 +0000 UTC m=+1194.018062956" watchObservedRunningTime="2025-10-01 15:15:50.354028067 +0000 UTC m=+1194.973203228" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.417665 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-combined-ca-bundle\") pod \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.417751 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfe5278-08a7-4ece-aaea-bc4e123f028d-logs\") pod \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.417829 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-config-data\") pod \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.417915 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgckk\" (UniqueName: \"kubernetes.io/projected/4bfe5278-08a7-4ece-aaea-bc4e123f028d-kube-api-access-xgckk\") pod \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\" (UID: \"4bfe5278-08a7-4ece-aaea-bc4e123f028d\") " Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.419416 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4bfe5278-08a7-4ece-aaea-bc4e123f028d-logs" (OuterVolumeSpecName: "logs") pod "4bfe5278-08a7-4ece-aaea-bc4e123f028d" (UID: "4bfe5278-08a7-4ece-aaea-bc4e123f028d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.424292 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bfe5278-08a7-4ece-aaea-bc4e123f028d-kube-api-access-xgckk" (OuterVolumeSpecName: "kube-api-access-xgckk") pod "4bfe5278-08a7-4ece-aaea-bc4e123f028d" (UID: "4bfe5278-08a7-4ece-aaea-bc4e123f028d"). InnerVolumeSpecName "kube-api-access-xgckk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.447797 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-config-data" (OuterVolumeSpecName: "config-data") pod "4bfe5278-08a7-4ece-aaea-bc4e123f028d" (UID: "4bfe5278-08a7-4ece-aaea-bc4e123f028d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.449183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4bfe5278-08a7-4ece-aaea-bc4e123f028d" (UID: "4bfe5278-08a7-4ece-aaea-bc4e123f028d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.520039 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.520063 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4bfe5278-08a7-4ece-aaea-bc4e123f028d-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.520072 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4bfe5278-08a7-4ece-aaea-bc4e123f028d-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:50 crc kubenswrapper[4771]: I1001 15:15:50.520080 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgckk\" (UniqueName: \"kubernetes.io/projected/4bfe5278-08a7-4ece-aaea-bc4e123f028d-kube-api-access-xgckk\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.346240 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.418327 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.426276 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.432684 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:51 crc kubenswrapper[4771]: E1001 15:15:51.433290 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerName="nova-metadata-metadata" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.433335 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerName="nova-metadata-metadata" Oct 01 15:15:51 crc kubenswrapper[4771]: E1001 15:15:51.433376 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerName="nova-metadata-log" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.433382 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerName="nova-metadata-log" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.433986 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerName="nova-metadata-log" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.434042 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" containerName="nova-metadata-metadata" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.435499 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.446032 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.446274 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.453686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.539611 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.539720 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.539800 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187e842b-e759-4c93-a584-53d3a9cc4bc0-logs\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.539850 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-config-data\") pod \"nova-metadata-0\" (UID: 
\"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.539866 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss8w5\" (UniqueName: \"kubernetes.io/projected/187e842b-e759-4c93-a584-53d3a9cc4bc0-kube-api-access-ss8w5\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.641043 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.641099 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187e842b-e759-4c93-a584-53d3a9cc4bc0-logs\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.641142 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-config-data\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.641162 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss8w5\" (UniqueName: \"kubernetes.io/projected/187e842b-e759-4c93-a584-53d3a9cc4bc0-kube-api-access-ss8w5\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.641224 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.641770 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187e842b-e759-4c93-a584-53d3a9cc4bc0-logs\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.646414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.647511 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-config-data\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.648002 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.661663 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss8w5\" (UniqueName: \"kubernetes.io/projected/187e842b-e759-4c93-a584-53d3a9cc4bc0-kube-api-access-ss8w5\") pod \"nova-metadata-0\" 
(UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.762903 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:51 crc kubenswrapper[4771]: I1001 15:15:51.998477 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bfe5278-08a7-4ece-aaea-bc4e123f028d" path="/var/lib/kubelet/pods/4bfe5278-08a7-4ece-aaea-bc4e123f028d/volumes" Oct 01 15:15:52 crc kubenswrapper[4771]: W1001 15:15:52.223656 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod187e842b_e759_4c93_a584_53d3a9cc4bc0.slice/crio-e499f65674ea0bd298c56009a1b9de36a662360c1be78d63d3e85c36eedc705d WatchSource:0}: Error finding container e499f65674ea0bd298c56009a1b9de36a662360c1be78d63d3e85c36eedc705d: Status 404 returned error can't find the container with id e499f65674ea0bd298c56009a1b9de36a662360c1be78d63d3e85c36eedc705d Oct 01 15:15:52 crc kubenswrapper[4771]: I1001 15:15:52.231002 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:52 crc kubenswrapper[4771]: I1001 15:15:52.360781 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"187e842b-e759-4c93-a584-53d3a9cc4bc0","Type":"ContainerStarted","Data":"e499f65674ea0bd298c56009a1b9de36a662360c1be78d63d3e85c36eedc705d"} Oct 01 15:15:52 crc kubenswrapper[4771]: I1001 15:15:52.362934 4771 generic.go:334] "Generic (PLEG): container finished" podID="97e141b7-bc02-4fbe-b918-6b31a4dea6cf" containerID="7ac0a44bd285907f9bc19f2bc2ec9d571c6507531c623cdf49edb9ca1b72fc57" exitCode=0 Oct 01 15:15:52 crc kubenswrapper[4771]: I1001 15:15:52.362963 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hqpb2" 
event={"ID":"97e141b7-bc02-4fbe-b918-6b31a4dea6cf","Type":"ContainerDied","Data":"7ac0a44bd285907f9bc19f2bc2ec9d571c6507531c623cdf49edb9ca1b72fc57"} Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.376380 4771 generic.go:334] "Generic (PLEG): container finished" podID="abc660b6-f8c7-4a39-b5f1-861ecaa73e75" containerID="2bda283390b2179b92ccb2c1cdb4baeb170ec2009c6bf7a5860510b7057457d6" exitCode=0 Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.376665 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fb68r" event={"ID":"abc660b6-f8c7-4a39-b5f1-861ecaa73e75","Type":"ContainerDied","Data":"2bda283390b2179b92ccb2c1cdb4baeb170ec2009c6bf7a5860510b7057457d6"} Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.380969 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"187e842b-e759-4c93-a584-53d3a9cc4bc0","Type":"ContainerStarted","Data":"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb"} Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.381003 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"187e842b-e759-4c93-a584-53d3a9cc4bc0","Type":"ContainerStarted","Data":"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640"} Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.429637 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.429619185 podStartE2EDuration="2.429619185s" podCreationTimestamp="2025-10-01 15:15:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:53.427712998 +0000 UTC m=+1198.046888169" watchObservedRunningTime="2025-10-01 15:15:53.429619185 +0000 UTC m=+1198.048794356" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.656998 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.657416 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.682044 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.682131 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.727097 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.777386 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.882762 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-scripts\") pod \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.882936 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ds4df\" (UniqueName: \"kubernetes.io/projected/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-kube-api-access-ds4df\") pod \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.882989 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-combined-ca-bundle\") pod \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " Oct 01 
15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.883080 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-config-data\") pod \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\" (UID: \"97e141b7-bc02-4fbe-b918-6b31a4dea6cf\") " Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.887528 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-scripts" (OuterVolumeSpecName: "scripts") pod "97e141b7-bc02-4fbe-b918-6b31a4dea6cf" (UID: "97e141b7-bc02-4fbe-b918-6b31a4dea6cf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.887912 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-kube-api-access-ds4df" (OuterVolumeSpecName: "kube-api-access-ds4df") pod "97e141b7-bc02-4fbe-b918-6b31a4dea6cf" (UID: "97e141b7-bc02-4fbe-b918-6b31a4dea6cf"). InnerVolumeSpecName "kube-api-access-ds4df". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.914237 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97e141b7-bc02-4fbe-b918-6b31a4dea6cf" (UID: "97e141b7-bc02-4fbe-b918-6b31a4dea6cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.916365 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-config-data" (OuterVolumeSpecName: "config-data") pod "97e141b7-bc02-4fbe-b918-6b31a4dea6cf" (UID: "97e141b7-bc02-4fbe-b918-6b31a4dea6cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.985852 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ds4df\" (UniqueName: \"kubernetes.io/projected/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-kube-api-access-ds4df\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.985885 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.985895 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:53 crc kubenswrapper[4771]: I1001 15:15:53.985903 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97e141b7-bc02-4fbe-b918-6b31a4dea6cf-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.037213 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.105122 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.169657 4771 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-6rfg4"] Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.169955 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" podUID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" containerName="dnsmasq-dns" containerID="cri-o://88e46f77c24fe812a9282ebcc93064318f0b3fbbb76eb9a355bcf862301ba3d0" gracePeriod=10 Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.390089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-hqpb2" event={"ID":"97e141b7-bc02-4fbe-b918-6b31a4dea6cf","Type":"ContainerDied","Data":"8bd0480560c28893089b1e58ae180269ae22bebe47f3328d57c9c123ad8491bd"} Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.390127 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8bd0480560c28893089b1e58ae180269ae22bebe47f3328d57c9c123ad8491bd" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.390179 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-hqpb2" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.407406 4771 generic.go:334] "Generic (PLEG): container finished" podID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" containerID="88e46f77c24fe812a9282ebcc93064318f0b3fbbb76eb9a355bcf862301ba3d0" exitCode=0 Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.407782 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" event={"ID":"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a","Type":"ContainerDied","Data":"88e46f77c24fe812a9282ebcc93064318f0b3fbbb76eb9a355bcf862301ba3d0"} Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.481187 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.595828 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.608620 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.627062 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.731956 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-config\") pod \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.732695 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-nb\") pod \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.732756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v42j\" (UniqueName: \"kubernetes.io/projected/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-kube-api-access-6v42j\") pod \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.732791 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-svc\") pod \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.732899 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-sb\") pod \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.732970 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-swift-storage-0\") pod \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\" (UID: \"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a\") " Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.737598 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-kube-api-access-6v42j" (OuterVolumeSpecName: "kube-api-access-6v42j") pod "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" (UID: "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a"). InnerVolumeSpecName "kube-api-access-6v42j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.739909 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.739906 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.188:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.834770 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v42j\" (UniqueName: \"kubernetes.io/projected/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-kube-api-access-6v42j\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.846341 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-config" (OuterVolumeSpecName: "config") pod "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" (UID: "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.866844 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" (UID: "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.868182 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" (UID: "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.871106 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" (UID: "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.886175 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" (UID: "a9e21fb0-30b7-41e6-b40b-b2f1bca3043a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.934831 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.935940 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.935966 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.935975 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.935985 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:54 crc kubenswrapper[4771]: I1001 15:15:54.935993 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.037010 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-config-data\") pod \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.037109 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-combined-ca-bundle\") pod \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.037143 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tpvl\" (UniqueName: \"kubernetes.io/projected/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-kube-api-access-8tpvl\") pod \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.037211 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-scripts\") pod \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\" (UID: \"abc660b6-f8c7-4a39-b5f1-861ecaa73e75\") " Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.040723 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-kube-api-access-8tpvl" (OuterVolumeSpecName: "kube-api-access-8tpvl") pod "abc660b6-f8c7-4a39-b5f1-861ecaa73e75" (UID: "abc660b6-f8c7-4a39-b5f1-861ecaa73e75"). InnerVolumeSpecName "kube-api-access-8tpvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.052740 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-scripts" (OuterVolumeSpecName: "scripts") pod "abc660b6-f8c7-4a39-b5f1-861ecaa73e75" (UID: "abc660b6-f8c7-4a39-b5f1-861ecaa73e75"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.070941 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "abc660b6-f8c7-4a39-b5f1-861ecaa73e75" (UID: "abc660b6-f8c7-4a39-b5f1-861ecaa73e75"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.072241 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.091309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-config-data" (OuterVolumeSpecName: "config-data") pod "abc660b6-f8c7-4a39-b5f1-861ecaa73e75" (UID: "abc660b6-f8c7-4a39-b5f1-861ecaa73e75"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.139521 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.139556 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.139567 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tpvl\" (UniqueName: \"kubernetes.io/projected/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-kube-api-access-8tpvl\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.139575 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/abc660b6-f8c7-4a39-b5f1-861ecaa73e75-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.423422 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" event={"ID":"a9e21fb0-30b7-41e6-b40b-b2f1bca3043a","Type":"ContainerDied","Data":"0946af39b7eb661a94b8d0b8c3bd7dbcc335b425a37f465b78d8f447e9af8cf2"} Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.423496 4771 scope.go:117] "RemoveContainer" containerID="88e46f77c24fe812a9282ebcc93064318f0b3fbbb76eb9a355bcf862301ba3d0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.423700 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-6rfg4" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.427432 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-log" containerID="cri-o://b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c" gracePeriod=30 Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.427565 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-api" containerID="cri-o://e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470" gracePeriod=30 Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.427723 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fb68r" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.427880 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fb68r" event={"ID":"abc660b6-f8c7-4a39-b5f1-861ecaa73e75","Type":"ContainerDied","Data":"c0c8ca476a05553249d06440eb6633beb2579b8efdc655007d751a9399adabdb"} Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.428665 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0c8ca476a05553249d06440eb6633beb2579b8efdc655007d751a9399adabdb" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.428120 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerName="nova-metadata-metadata" containerID="cri-o://08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb" gracePeriod=30 Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.428091 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerName="nova-metadata-log" containerID="cri-o://8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640" gracePeriod=30 Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.449666 4771 scope.go:117] "RemoveContainer" containerID="48592c9a7cb5619d3e4d46aeb65545fcb70d6aa842f0e7ce407ef457489cc869" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.490921 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 15:15:55 crc kubenswrapper[4771]: E1001 15:15:55.491282 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abc660b6-f8c7-4a39-b5f1-861ecaa73e75" containerName="nova-cell1-conductor-db-sync" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.491296 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="abc660b6-f8c7-4a39-b5f1-861ecaa73e75" containerName="nova-cell1-conductor-db-sync" Oct 01 15:15:55 crc kubenswrapper[4771]: E1001 15:15:55.491309 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" containerName="init" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.491316 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" containerName="init" Oct 01 15:15:55 crc kubenswrapper[4771]: E1001 15:15:55.491328 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e141b7-bc02-4fbe-b918-6b31a4dea6cf" containerName="nova-manage" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.491335 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e141b7-bc02-4fbe-b918-6b31a4dea6cf" containerName="nova-manage" Oct 01 15:15:55 crc kubenswrapper[4771]: E1001 15:15:55.491371 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" containerName="dnsmasq-dns" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.491376 
4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" containerName="dnsmasq-dns" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.491528 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e141b7-bc02-4fbe-b918-6b31a4dea6cf" containerName="nova-manage" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.491542 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="abc660b6-f8c7-4a39-b5f1-861ecaa73e75" containerName="nova-cell1-conductor-db-sync" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.491559 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" containerName="dnsmasq-dns" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.492260 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.508372 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.511828 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.519866 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-6rfg4"] Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.528935 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-6rfg4"] Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.548809 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6jns\" (UniqueName: \"kubernetes.io/projected/8a756a41-c563-4d6c-a5e8-907724f1847c-kube-api-access-b6jns\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 
15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.548881 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a756a41-c563-4d6c-a5e8-907724f1847c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.548956 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a756a41-c563-4d6c-a5e8-907724f1847c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.651189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6jns\" (UniqueName: \"kubernetes.io/projected/8a756a41-c563-4d6c-a5e8-907724f1847c-kube-api-access-b6jns\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.651330 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a756a41-c563-4d6c-a5e8-907724f1847c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.651483 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a756a41-c563-4d6c-a5e8-907724f1847c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.656182 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a756a41-c563-4d6c-a5e8-907724f1847c-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.656475 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a756a41-c563-4d6c-a5e8-907724f1847c-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.674459 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6jns\" (UniqueName: \"kubernetes.io/projected/8a756a41-c563-4d6c-a5e8-907724f1847c-kube-api-access-b6jns\") pod \"nova-cell1-conductor-0\" (UID: \"8a756a41-c563-4d6c-a5e8-907724f1847c\") " pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:55 crc kubenswrapper[4771]: I1001 15:15:55.821278 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.025660 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e21fb0-30b7-41e6-b40b-b2f1bca3043a" path="/var/lib/kubelet/pods/a9e21fb0-30b7-41e6-b40b-b2f1bca3043a/volumes" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.068833 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.197004 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-nova-metadata-tls-certs\") pod \"187e842b-e759-4c93-a584-53d3a9cc4bc0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.197049 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss8w5\" (UniqueName: \"kubernetes.io/projected/187e842b-e759-4c93-a584-53d3a9cc4bc0-kube-api-access-ss8w5\") pod \"187e842b-e759-4c93-a584-53d3a9cc4bc0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.197143 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187e842b-e759-4c93-a584-53d3a9cc4bc0-logs\") pod \"187e842b-e759-4c93-a584-53d3a9cc4bc0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.197175 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-config-data\") pod \"187e842b-e759-4c93-a584-53d3a9cc4bc0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.197337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-combined-ca-bundle\") pod \"187e842b-e759-4c93-a584-53d3a9cc4bc0\" (UID: \"187e842b-e759-4c93-a584-53d3a9cc4bc0\") " Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.199875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/187e842b-e759-4c93-a584-53d3a9cc4bc0-logs" (OuterVolumeSpecName: "logs") pod "187e842b-e759-4c93-a584-53d3a9cc4bc0" (UID: "187e842b-e759-4c93-a584-53d3a9cc4bc0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.208924 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187e842b-e759-4c93-a584-53d3a9cc4bc0-kube-api-access-ss8w5" (OuterVolumeSpecName: "kube-api-access-ss8w5") pod "187e842b-e759-4c93-a584-53d3a9cc4bc0" (UID: "187e842b-e759-4c93-a584-53d3a9cc4bc0"). InnerVolumeSpecName "kube-api-access-ss8w5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.236968 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "187e842b-e759-4c93-a584-53d3a9cc4bc0" (UID: "187e842b-e759-4c93-a584-53d3a9cc4bc0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.240223 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-config-data" (OuterVolumeSpecName: "config-data") pod "187e842b-e759-4c93-a584-53d3a9cc4bc0" (UID: "187e842b-e759-4c93-a584-53d3a9cc4bc0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.257893 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "187e842b-e759-4c93-a584-53d3a9cc4bc0" (UID: "187e842b-e759-4c93-a584-53d3a9cc4bc0"). 
InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.300030 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.300071 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.300084 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss8w5\" (UniqueName: \"kubernetes.io/projected/187e842b-e759-4c93-a584-53d3a9cc4bc0-kube-api-access-ss8w5\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.300093 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/187e842b-e759-4c93-a584-53d3a9cc4bc0-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.300104 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187e842b-e759-4c93-a584-53d3a9cc4bc0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.391679 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.447532 4771 generic.go:334] "Generic (PLEG): container finished" podID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerID="08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb" exitCode=0 Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.447569 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerID="8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640" exitCode=143 Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.447830 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"187e842b-e759-4c93-a584-53d3a9cc4bc0","Type":"ContainerDied","Data":"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb"} Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.447872 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"187e842b-e759-4c93-a584-53d3a9cc4bc0","Type":"ContainerDied","Data":"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640"} Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.447884 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"187e842b-e759-4c93-a584-53d3a9cc4bc0","Type":"ContainerDied","Data":"e499f65674ea0bd298c56009a1b9de36a662360c1be78d63d3e85c36eedc705d"} Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.447902 4771 scope.go:117] "RemoveContainer" containerID="08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.447902 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.470408 4771 generic.go:334] "Generic (PLEG): container finished" podID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerID="b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c" exitCode=143 Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.470474 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d89ad9e9-e14e-4082-9121-39ae3de01ab3","Type":"ContainerDied","Data":"b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c"} Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.476947 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="08a4e4bf-4009-4d69-b513-f821239be25c" containerName="nova-scheduler-scheduler" containerID="cri-o://6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b" gracePeriod=30 Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.477217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8a756a41-c563-4d6c-a5e8-907724f1847c","Type":"ContainerStarted","Data":"d747eed3a14422122cadff1993960f9b557a9bbb0c1edb7c6db9e2a93273bae0"} Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.477320 4771 scope.go:117] "RemoveContainer" containerID="8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.502574 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.523258 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.530424 4771 scope.go:117] "RemoveContainer" containerID="08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb" Oct 01 15:15:56 crc kubenswrapper[4771]: E1001 
15:15:56.530859 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb\": container with ID starting with 08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb not found: ID does not exist" containerID="08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.530887 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb"} err="failed to get container status \"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb\": rpc error: code = NotFound desc = could not find container \"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb\": container with ID starting with 08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb not found: ID does not exist" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.530912 4771 scope.go:117] "RemoveContainer" containerID="8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640" Oct 01 15:15:56 crc kubenswrapper[4771]: E1001 15:15:56.531158 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640\": container with ID starting with 8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640 not found: ID does not exist" containerID="8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.531181 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640"} err="failed to get container status \"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640\": rpc 
error: code = NotFound desc = could not find container \"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640\": container with ID starting with 8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640 not found: ID does not exist" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.531195 4771 scope.go:117] "RemoveContainer" containerID="08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.531265 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:56 crc kubenswrapper[4771]: E1001 15:15:56.531798 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerName="nova-metadata-log" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.531835 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerName="nova-metadata-log" Oct 01 15:15:56 crc kubenswrapper[4771]: E1001 15:15:56.531847 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerName="nova-metadata-metadata" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.531856 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerName="nova-metadata-metadata" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.532198 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerName="nova-metadata-log" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.532251 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" containerName="nova-metadata-metadata" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.533842 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.535639 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb"} err="failed to get container status \"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb\": rpc error: code = NotFound desc = could not find container \"08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb\": container with ID starting with 08985d829f44174b2f22ff2a936a00f59cf5f6a6b810708b6c15f0c2c7d1ccfb not found: ID does not exist" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.535673 4771 scope.go:117] "RemoveContainer" containerID="8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.539094 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.542890 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640"} err="failed to get container status \"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640\": rpc error: code = NotFound desc = could not find container \"8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640\": container with ID starting with 8a3cc4764203b1f30ae5aa2d311b3b73591951229c35bdb9ae5c14ff5e47d640 not found: ID does not exist" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.543099 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.543263 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.706642 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.706692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjntr\" (UniqueName: \"kubernetes.io/projected/c614d971-1db4-4f84-95d3-214aff9ed919-kube-api-access-qjntr\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.706870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.706909 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-config-data\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.706962 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c614d971-1db4-4f84-95d3-214aff9ed919-logs\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.808515 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/c614d971-1db4-4f84-95d3-214aff9ed919-logs\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.808885 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.808923 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjntr\" (UniqueName: \"kubernetes.io/projected/c614d971-1db4-4f84-95d3-214aff9ed919-kube-api-access-qjntr\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.808979 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c614d971-1db4-4f84-95d3-214aff9ed919-logs\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.809029 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.809058 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-config-data\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " 
pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.812664 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.813146 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.818310 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-config-data\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.835600 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjntr\" (UniqueName: \"kubernetes.io/projected/c614d971-1db4-4f84-95d3-214aff9ed919-kube-api-access-qjntr\") pod \"nova-metadata-0\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " pod="openstack/nova-metadata-0" Oct 01 15:15:56 crc kubenswrapper[4771]: I1001 15:15:56.877955 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:15:57 crc kubenswrapper[4771]: I1001 15:15:57.325229 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:15:57 crc kubenswrapper[4771]: I1001 15:15:57.489885 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"8a756a41-c563-4d6c-a5e8-907724f1847c","Type":"ContainerStarted","Data":"d561d91ba7d19a79b7375745721a11059bcb91ae06ca292e1cc5bb5b67846961"} Oct 01 15:15:57 crc kubenswrapper[4771]: I1001 15:15:57.490005 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 01 15:15:57 crc kubenswrapper[4771]: I1001 15:15:57.492570 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c614d971-1db4-4f84-95d3-214aff9ed919","Type":"ContainerStarted","Data":"aafb4cd65b25fce3deff85f098ecb6c17d3bda5d3d94554ec66fe39def4d8e6e"} Oct 01 15:15:57 crc kubenswrapper[4771]: I1001 15:15:57.514943 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.514918288 podStartE2EDuration="2.514918288s" podCreationTimestamp="2025-10-01 15:15:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:57.50729112 +0000 UTC m=+1202.126466331" watchObservedRunningTime="2025-10-01 15:15:57.514918288 +0000 UTC m=+1202.134093499" Oct 01 15:15:58 crc kubenswrapper[4771]: I1001 15:15:58.002112 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187e842b-e759-4c93-a584-53d3a9cc4bc0" path="/var/lib/kubelet/pods/187e842b-e759-4c93-a584-53d3a9cc4bc0/volumes" Oct 01 15:15:58 crc kubenswrapper[4771]: I1001 15:15:58.505503 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"c614d971-1db4-4f84-95d3-214aff9ed919","Type":"ContainerStarted","Data":"f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970"} Oct 01 15:15:58 crc kubenswrapper[4771]: I1001 15:15:58.505565 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c614d971-1db4-4f84-95d3-214aff9ed919","Type":"ContainerStarted","Data":"1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf"} Oct 01 15:15:58 crc kubenswrapper[4771]: I1001 15:15:58.534786 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.5347655529999997 podStartE2EDuration="2.534765553s" podCreationTimestamp="2025-10-01 15:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:15:58.528899527 +0000 UTC m=+1203.148074698" watchObservedRunningTime="2025-10-01 15:15:58.534765553 +0000 UTC m=+1203.153940734" Oct 01 15:15:58 crc kubenswrapper[4771]: E1001 15:15:58.682538 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 15:15:58 crc kubenswrapper[4771]: E1001 15:15:58.685038 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 15:15:58 crc kubenswrapper[4771]: E1001 15:15:58.687760 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an 
exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 15:15:58 crc kubenswrapper[4771]: E1001 15:15:58.687848 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="08a4e4bf-4009-4d69-b513-f821239be25c" containerName="nova-scheduler-scheduler" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.031031 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.185218 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jj45z\" (UniqueName: \"kubernetes.io/projected/08a4e4bf-4009-4d69-b513-f821239be25c-kube-api-access-jj45z\") pod \"08a4e4bf-4009-4d69-b513-f821239be25c\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.185304 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-combined-ca-bundle\") pod \"08a4e4bf-4009-4d69-b513-f821239be25c\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.185432 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-config-data\") pod \"08a4e4bf-4009-4d69-b513-f821239be25c\" (UID: \"08a4e4bf-4009-4d69-b513-f821239be25c\") " Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.192493 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/08a4e4bf-4009-4d69-b513-f821239be25c-kube-api-access-jj45z" (OuterVolumeSpecName: "kube-api-access-jj45z") pod "08a4e4bf-4009-4d69-b513-f821239be25c" (UID: "08a4e4bf-4009-4d69-b513-f821239be25c"). InnerVolumeSpecName "kube-api-access-jj45z". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.216009 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08a4e4bf-4009-4d69-b513-f821239be25c" (UID: "08a4e4bf-4009-4d69-b513-f821239be25c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.240530 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-config-data" (OuterVolumeSpecName: "config-data") pod "08a4e4bf-4009-4d69-b513-f821239be25c" (UID: "08a4e4bf-4009-4d69-b513-f821239be25c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.288669 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.288705 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jj45z\" (UniqueName: \"kubernetes.io/projected/08a4e4bf-4009-4d69-b513-f821239be25c-kube-api-access-jj45z\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.288722 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08a4e4bf-4009-4d69-b513-f821239be25c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.297482 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.491498 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89ad9e9-e14e-4082-9121-39ae3de01ab3-logs\") pod \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.491545 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-config-data\") pod \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.491812 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfmjt\" (UniqueName: \"kubernetes.io/projected/d89ad9e9-e14e-4082-9121-39ae3de01ab3-kube-api-access-kfmjt\") pod 
\"d89ad9e9-e14e-4082-9121-39ae3de01ab3\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.491855 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-combined-ca-bundle\") pod \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\" (UID: \"d89ad9e9-e14e-4082-9121-39ae3de01ab3\") " Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.492349 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d89ad9e9-e14e-4082-9121-39ae3de01ab3-logs" (OuterVolumeSpecName: "logs") pod "d89ad9e9-e14e-4082-9121-39ae3de01ab3" (UID: "d89ad9e9-e14e-4082-9121-39ae3de01ab3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.497119 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d89ad9e9-e14e-4082-9121-39ae3de01ab3-kube-api-access-kfmjt" (OuterVolumeSpecName: "kube-api-access-kfmjt") pod "d89ad9e9-e14e-4082-9121-39ae3de01ab3" (UID: "d89ad9e9-e14e-4082-9121-39ae3de01ab3"). InnerVolumeSpecName "kube-api-access-kfmjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.521672 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-config-data" (OuterVolumeSpecName: "config-data") pod "d89ad9e9-e14e-4082-9121-39ae3de01ab3" (UID: "d89ad9e9-e14e-4082-9121-39ae3de01ab3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.538376 4771 generic.go:334] "Generic (PLEG): container finished" podID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerID="e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470" exitCode=0 Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.538479 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d89ad9e9-e14e-4082-9121-39ae3de01ab3","Type":"ContainerDied","Data":"e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470"} Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.538606 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d89ad9e9-e14e-4082-9121-39ae3de01ab3","Type":"ContainerDied","Data":"d59b25de00fdaf17c470f1bd1e781302aa1b147721ea86c44081000e324fcbee"} Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.538672 4771 scope.go:117] "RemoveContainer" containerID="e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.538477 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.540054 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d89ad9e9-e14e-4082-9121-39ae3de01ab3" (UID: "d89ad9e9-e14e-4082-9121-39ae3de01ab3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.540416 4771 generic.go:334] "Generic (PLEG): container finished" podID="08a4e4bf-4009-4d69-b513-f821239be25c" containerID="6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b" exitCode=0 Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.540468 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.540478 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"08a4e4bf-4009-4d69-b513-f821239be25c","Type":"ContainerDied","Data":"6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b"} Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.540832 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"08a4e4bf-4009-4d69-b513-f821239be25c","Type":"ContainerDied","Data":"cf8294e01a66578a2f8cd637bac8a9ba270c6e7dfe115b0ff163b9e32828a145"} Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.573532 4771 scope.go:117] "RemoveContainer" containerID="b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.595572 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d89ad9e9-e14e-4082-9121-39ae3de01ab3-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.595599 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.595612 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfmjt\" (UniqueName: 
\"kubernetes.io/projected/d89ad9e9-e14e-4082-9121-39ae3de01ab3-kube-api-access-kfmjt\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.595620 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89ad9e9-e14e-4082-9121-39ae3de01ab3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.605655 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.610297 4771 scope.go:117] "RemoveContainer" containerID="e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470" Oct 01 15:16:00 crc kubenswrapper[4771]: E1001 15:16:00.610703 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470\": container with ID starting with e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470 not found: ID does not exist" containerID="e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.610807 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470"} err="failed to get container status \"e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470\": rpc error: code = NotFound desc = could not find container \"e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470\": container with ID starting with e49433c502f6816c19d599dd05f008d9bf78adb1b3b7f49c543c37bc8bee8470 not found: ID does not exist" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.610834 4771 scope.go:117] "RemoveContainer" containerID="b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c" Oct 01 15:16:00 crc 
kubenswrapper[4771]: E1001 15:16:00.611307 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c\": container with ID starting with b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c not found: ID does not exist" containerID="b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.611338 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c"} err="failed to get container status \"b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c\": rpc error: code = NotFound desc = could not find container \"b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c\": container with ID starting with b2452803e43391328b6c482a1d96540a9930a0786aed260396b88bda9c598c4c not found: ID does not exist" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.611356 4771 scope.go:117] "RemoveContainer" containerID="6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.622967 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.635650 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:00 crc kubenswrapper[4771]: E1001 15:16:00.636205 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-api" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.636226 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-api" Oct 01 15:16:00 crc kubenswrapper[4771]: E1001 15:16:00.636281 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-log" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.636291 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-log" Oct 01 15:16:00 crc kubenswrapper[4771]: E1001 15:16:00.636309 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08a4e4bf-4009-4d69-b513-f821239be25c" containerName="nova-scheduler-scheduler" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.636319 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="08a4e4bf-4009-4d69-b513-f821239be25c" containerName="nova-scheduler-scheduler" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.636546 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="08a4e4bf-4009-4d69-b513-f821239be25c" containerName="nova-scheduler-scheduler" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.636599 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-api" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.636641 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" containerName="nova-api-log" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.637400 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.640804 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.660505 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.673860 4771 scope.go:117] "RemoveContainer" containerID="6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b" Oct 01 15:16:00 crc kubenswrapper[4771]: E1001 15:16:00.674188 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b\": container with ID starting with 6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b not found: ID does not exist" containerID="6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.674250 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b"} err="failed to get container status \"6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b\": rpc error: code = NotFound desc = could not find container \"6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b\": container with ID starting with 6f0f3aca62e00797f194aa1cb54c9a72e0a60c49ddb7a8f3df6150aa5b5dd95b not found: ID does not exist" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.799197 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " 
pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.799277 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2skw8\" (UniqueName: \"kubernetes.io/projected/2d6c6725-6778-463a-89ab-d042864eca91-kube-api-access-2skw8\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.799318 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-config-data\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.875595 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.893270 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.901060 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-config-data\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.901236 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.901237 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc 
kubenswrapper[4771]: I1001 15:16:00.903878 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2skw8\" (UniqueName: \"kubernetes.io/projected/2d6c6725-6778-463a-89ab-d042864eca91-kube-api-access-2skw8\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.904068 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.907000 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.908422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.908448 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-config-data\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.931341 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2skw8\" (UniqueName: \"kubernetes.io/projected/2d6c6725-6778-463a-89ab-d042864eca91-kube-api-access-2skw8\") pod \"nova-scheduler-0\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.941464 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:00 crc kubenswrapper[4771]: I1001 15:16:00.966975 4771 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.005280 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.005339 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a233694a-62d8-4f80-bb50-5764b8fdc3bb-logs\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.005426 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92w72\" (UniqueName: \"kubernetes.io/projected/a233694a-62d8-4f80-bb50-5764b8fdc3bb-kube-api-access-92w72\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.005462 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-config-data\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.110935 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.111396 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a233694a-62d8-4f80-bb50-5764b8fdc3bb-logs\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.111450 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92w72\" (UniqueName: \"kubernetes.io/projected/a233694a-62d8-4f80-bb50-5764b8fdc3bb-kube-api-access-92w72\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.111494 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-config-data\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.112187 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a233694a-62d8-4f80-bb50-5764b8fdc3bb-logs\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.126434 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-config-data\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.127375 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 
15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.132323 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92w72\" (UniqueName: \"kubernetes.io/projected/a233694a-62d8-4f80-bb50-5764b8fdc3bb-kube-api-access-92w72\") pod \"nova-api-0\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.391850 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.405154 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.560067 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d6c6725-6778-463a-89ab-d042864eca91","Type":"ContainerStarted","Data":"8ad98d7e781bc73b8e86d0fd4d2b4af030534be095e524d4fc3b4d930285a342"} Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.878167 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.878299 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.919619 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.996673 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08a4e4bf-4009-4d69-b513-f821239be25c" path="/var/lib/kubelet/pods/08a4e4bf-4009-4d69-b513-f821239be25c/volumes" Oct 01 15:16:01 crc kubenswrapper[4771]: I1001 15:16:01.997792 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d89ad9e9-e14e-4082-9121-39ae3de01ab3" path="/var/lib/kubelet/pods/d89ad9e9-e14e-4082-9121-39ae3de01ab3/volumes" Oct 01 15:16:02 crc kubenswrapper[4771]: I1001 
15:16:02.573865 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a233694a-62d8-4f80-bb50-5764b8fdc3bb","Type":"ContainerStarted","Data":"320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff"} Oct 01 15:16:02 crc kubenswrapper[4771]: I1001 15:16:02.574297 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a233694a-62d8-4f80-bb50-5764b8fdc3bb","Type":"ContainerStarted","Data":"f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8"} Oct 01 15:16:02 crc kubenswrapper[4771]: I1001 15:16:02.574315 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a233694a-62d8-4f80-bb50-5764b8fdc3bb","Type":"ContainerStarted","Data":"d703c695a748d4d5ab78ae982fd92f2a62023e0db12129ed9a119207131d7c0a"} Oct 01 15:16:02 crc kubenswrapper[4771]: I1001 15:16:02.575609 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d6c6725-6778-463a-89ab-d042864eca91","Type":"ContainerStarted","Data":"6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606"} Oct 01 15:16:02 crc kubenswrapper[4771]: I1001 15:16:02.599039 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.599019187 podStartE2EDuration="2.599019187s" podCreationTimestamp="2025-10-01 15:16:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:16:02.595148122 +0000 UTC m=+1207.214323313" watchObservedRunningTime="2025-10-01 15:16:02.599019187 +0000 UTC m=+1207.218194358" Oct 01 15:16:02 crc kubenswrapper[4771]: I1001 15:16:02.613407 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.613386361 podStartE2EDuration="2.613386361s" podCreationTimestamp="2025-10-01 15:16:00 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:16:02.610567182 +0000 UTC m=+1207.229742353" watchObservedRunningTime="2025-10-01 15:16:02.613386361 +0000 UTC m=+1207.232561532" Oct 01 15:16:05 crc kubenswrapper[4771]: I1001 15:16:05.864232 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 01 15:16:05 crc kubenswrapper[4771]: I1001 15:16:05.968350 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 15:16:06 crc kubenswrapper[4771]: I1001 15:16:06.878821 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 15:16:06 crc kubenswrapper[4771]: I1001 15:16:06.878869 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 15:16:07 crc kubenswrapper[4771]: I1001 15:16:07.893922 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:07 crc kubenswrapper[4771]: I1001 15:16:07.893947 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:10 crc kubenswrapper[4771]: I1001 15:16:10.967623 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 15:16:10 crc kubenswrapper[4771]: I1001 15:16:10.993498 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Oct 01 15:16:11 crc kubenswrapper[4771]: I1001 15:16:11.393133 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 15:16:11 crc kubenswrapper[4771]: I1001 15:16:11.393188 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 15:16:11 crc kubenswrapper[4771]: I1001 15:16:11.706403 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 15:16:12 crc kubenswrapper[4771]: I1001 15:16:12.476152 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:12 crc kubenswrapper[4771]: I1001 15:16:12.476162 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:12 crc kubenswrapper[4771]: I1001 15:16:12.514132 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.303150 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.303698 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="eca5bbfa-3927-4c5b-b973-7dce060db69b" containerName="kube-state-metrics" containerID="cri-o://2ebe277151b21c46d2d99dbef04c1ed19ccdc61d7ba1739b0a3ef2c97c4243d6" gracePeriod=30 Oct 01 15:16:16 crc kubenswrapper[4771]: 
I1001 15:16:16.717383 4771 generic.go:334] "Generic (PLEG): container finished" podID="eca5bbfa-3927-4c5b-b973-7dce060db69b" containerID="2ebe277151b21c46d2d99dbef04c1ed19ccdc61d7ba1739b0a3ef2c97c4243d6" exitCode=2 Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.717496 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eca5bbfa-3927-4c5b-b973-7dce060db69b","Type":"ContainerDied","Data":"2ebe277151b21c46d2d99dbef04c1ed19ccdc61d7ba1739b0a3ef2c97c4243d6"} Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.820084 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.843999 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9z5w\" (UniqueName: \"kubernetes.io/projected/eca5bbfa-3927-4c5b-b973-7dce060db69b-kube-api-access-h9z5w\") pod \"eca5bbfa-3927-4c5b-b973-7dce060db69b\" (UID: \"eca5bbfa-3927-4c5b-b973-7dce060db69b\") " Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.855611 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eca5bbfa-3927-4c5b-b973-7dce060db69b-kube-api-access-h9z5w" (OuterVolumeSpecName: "kube-api-access-h9z5w") pod "eca5bbfa-3927-4c5b-b973-7dce060db69b" (UID: "eca5bbfa-3927-4c5b-b973-7dce060db69b"). InnerVolumeSpecName "kube-api-access-h9z5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.884441 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.887022 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.898040 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 15:16:16 crc kubenswrapper[4771]: I1001 15:16:16.946336 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9z5w\" (UniqueName: \"kubernetes.io/projected/eca5bbfa-3927-4c5b-b973-7dce060db69b-kube-api-access-h9z5w\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.727255 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"eca5bbfa-3927-4c5b-b973-7dce060db69b","Type":"ContainerDied","Data":"bbafb8ac2a540e25e11b91db5a3bb2c49a8d1897e1e007ba4a5875748efa30ca"} Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.727284 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.727310 4771 scope.go:117] "RemoveContainer" containerID="2ebe277151b21c46d2d99dbef04c1ed19ccdc61d7ba1739b0a3ef2c97c4243d6" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.732669 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.787369 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.800798 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.808572 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:16:17 crc kubenswrapper[4771]: E1001 15:16:17.809305 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eca5bbfa-3927-4c5b-b973-7dce060db69b" containerName="kube-state-metrics" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.809396 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="eca5bbfa-3927-4c5b-b973-7dce060db69b" containerName="kube-state-metrics" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.809696 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="eca5bbfa-3927-4c5b-b973-7dce060db69b" containerName="kube-state-metrics" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.810582 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.816167 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.816356 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.826911 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.961420 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.961802 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.961964 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhcs4\" (UniqueName: \"kubernetes.io/projected/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-api-access-zhcs4\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.962087 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:17 crc kubenswrapper[4771]: I1001 15:16:17.995045 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eca5bbfa-3927-4c5b-b973-7dce060db69b" path="/var/lib/kubelet/pods/eca5bbfa-3927-4c5b-b973-7dce060db69b/volumes" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.063214 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhcs4\" (UniqueName: \"kubernetes.io/projected/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-api-access-zhcs4\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.063293 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.063354 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.063448 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " 
pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.068588 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.069681 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.070346 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.083172 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhcs4\" (UniqueName: \"kubernetes.io/projected/bbfa5749-f148-47da-8cbf-b88b1ea7bd9f-kube-api-access-zhcs4\") pod \"kube-state-metrics-0\" (UID: \"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f\") " pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.141801 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.317857 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.318265 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="sg-core" containerID="cri-o://d81312d3000de177a8e959024c071a0140f63f31b8ffbb4f615251640832a58f" gracePeriod=30 Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.318351 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="proxy-httpd" containerID="cri-o://743a4a90e2a8d1bba869c28314e97739392e0e2b236c4521ab60d98951ff670d" gracePeriod=30 Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.318435 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="ceilometer-notification-agent" containerID="cri-o://daa8b6d6c018fc866fc424dc0af864a8d02a7b70ed46211915e7f4a3716f8be0" gracePeriod=30 Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.318442 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="ceilometer-central-agent" containerID="cri-o://23ae913d6ac1cbd4c2707c2b990cdcad7e906aa212dfc4b3e288e4caa4aa4c4f" gracePeriod=30 Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.606725 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.741766 4771 generic.go:334] "Generic (PLEG): container finished" podID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" 
containerID="743a4a90e2a8d1bba869c28314e97739392e0e2b236c4521ab60d98951ff670d" exitCode=0 Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.741804 4771 generic.go:334] "Generic (PLEG): container finished" podID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerID="d81312d3000de177a8e959024c071a0140f63f31b8ffbb4f615251640832a58f" exitCode=2 Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.741814 4771 generic.go:334] "Generic (PLEG): container finished" podID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerID="23ae913d6ac1cbd4c2707c2b990cdcad7e906aa212dfc4b3e288e4caa4aa4c4f" exitCode=0 Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.741840 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerDied","Data":"743a4a90e2a8d1bba869c28314e97739392e0e2b236c4521ab60d98951ff670d"} Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.741881 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerDied","Data":"d81312d3000de177a8e959024c071a0140f63f31b8ffbb4f615251640832a58f"} Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.741891 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerDied","Data":"23ae913d6ac1cbd4c2707c2b990cdcad7e906aa212dfc4b3e288e4caa4aa4c4f"} Oct 01 15:16:18 crc kubenswrapper[4771]: I1001 15:16:18.743391 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f","Type":"ContainerStarted","Data":"dd447aabb8294c535be9a0107f1a5f157b473c21febf42b362ae1a27e0ba006c"} Oct 01 15:16:19 crc kubenswrapper[4771]: E1001 15:16:19.328218 4771 manager.go:1116] Failed to create existing container: 
/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod187e842b_e759_4c93_a584_53d3a9cc4bc0.slice/crio-e499f65674ea0bd298c56009a1b9de36a662360c1be78d63d3e85c36eedc705d: Error finding container e499f65674ea0bd298c56009a1b9de36a662360c1be78d63d3e85c36eedc705d: Status 404 returned error can't find the container with id e499f65674ea0bd298c56009a1b9de36a662360c1be78d63d3e85c36eedc705d Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.718289 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.753975 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"bbfa5749-f148-47da-8cbf-b88b1ea7bd9f","Type":"ContainerStarted","Data":"764d1d7a9a297ee5f08f84d2cdb4cf4a8284dec19ffbf46e7320b9a8778900bd"} Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.755271 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.757936 4771 generic.go:334] "Generic (PLEG): container finished" podID="7d74cde4-a358-4f81-9c52-7e4a9a1646f1" containerID="7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951" exitCode=137 Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.757988 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.758014 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7d74cde4-a358-4f81-9c52-7e4a9a1646f1","Type":"ContainerDied","Data":"7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951"} Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.758063 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7d74cde4-a358-4f81-9c52-7e4a9a1646f1","Type":"ContainerDied","Data":"39a5591c0b485e0269423a74303088d27a49e7a81b4932284c0750cc75e1f560"} Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.758081 4771 scope.go:117] "RemoveContainer" containerID="7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.772583 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.394750582 podStartE2EDuration="2.772563359s" podCreationTimestamp="2025-10-01 15:16:17 +0000 UTC" firstStartedPulling="2025-10-01 15:16:18.617571939 +0000 UTC m=+1223.236747110" lastFinishedPulling="2025-10-01 15:16:18.995384706 +0000 UTC m=+1223.614559887" observedRunningTime="2025-10-01 15:16:19.771506403 +0000 UTC m=+1224.390681604" watchObservedRunningTime="2025-10-01 15:16:19.772563359 +0000 UTC m=+1224.391738530" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.782647 4771 scope.go:117] "RemoveContainer" containerID="7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951" Oct 01 15:16:19 crc kubenswrapper[4771]: E1001 15:16:19.783146 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951\": container with ID starting with 
7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951 not found: ID does not exist" containerID="7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.783183 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951"} err="failed to get container status \"7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951\": rpc error: code = NotFound desc = could not find container \"7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951\": container with ID starting with 7e7a564aeaf570d53413068e4f2652c91526f4381e5e6d3fd8da85a657c78951 not found: ID does not exist" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.793199 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-combined-ca-bundle\") pod \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.793239 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-config-data\") pod \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.793258 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rssbb\" (UniqueName: \"kubernetes.io/projected/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-kube-api-access-rssbb\") pod \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\" (UID: \"7d74cde4-a358-4f81-9c52-7e4a9a1646f1\") " Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.798064 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-kube-api-access-rssbb" (OuterVolumeSpecName: "kube-api-access-rssbb") pod "7d74cde4-a358-4f81-9c52-7e4a9a1646f1" (UID: "7d74cde4-a358-4f81-9c52-7e4a9a1646f1"). InnerVolumeSpecName "kube-api-access-rssbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.824501 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-config-data" (OuterVolumeSpecName: "config-data") pod "7d74cde4-a358-4f81-9c52-7e4a9a1646f1" (UID: "7d74cde4-a358-4f81-9c52-7e4a9a1646f1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.827098 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d74cde4-a358-4f81-9c52-7e4a9a1646f1" (UID: "7d74cde4-a358-4f81-9c52-7e4a9a1646f1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.896038 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.896070 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:19 crc kubenswrapper[4771]: I1001 15:16:19.896111 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rssbb\" (UniqueName: \"kubernetes.io/projected/7d74cde4-a358-4f81-9c52-7e4a9a1646f1-kube-api-access-rssbb\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.108402 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.116838 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.161487 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:16:20 crc kubenswrapper[4771]: E1001 15:16:20.162029 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d74cde4-a358-4f81-9c52-7e4a9a1646f1" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.162057 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d74cde4-a358-4f81-9c52-7e4a9a1646f1" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.162315 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d74cde4-a358-4f81-9c52-7e4a9a1646f1" containerName="nova-cell1-novncproxy-novncproxy" Oct 01 
15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.163090 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.166613 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.166815 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.166825 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.247829 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.331546 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.331611 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.331631 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbsrk\" (UniqueName: \"kubernetes.io/projected/a8db97b3-f960-4eff-a879-c2b42c4e6364-kube-api-access-vbsrk\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.332933 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.333060 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.434904 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.435363 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.435423 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.435463 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbsrk\" (UniqueName: \"kubernetes.io/projected/a8db97b3-f960-4eff-a879-c2b42c4e6364-kube-api-access-vbsrk\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.435644 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.439543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.440702 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.440711 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 
15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.442246 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8db97b3-f960-4eff-a879-c2b42c4e6364-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.452649 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbsrk\" (UniqueName: \"kubernetes.io/projected/a8db97b3-f960-4eff-a879-c2b42c4e6364-kube-api-access-vbsrk\") pod \"nova-cell1-novncproxy-0\" (UID: \"a8db97b3-f960-4eff-a879-c2b42c4e6364\") " pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.481404 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.748746 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 01 15:16:20 crc kubenswrapper[4771]: W1001 15:16:20.760663 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8db97b3_f960_4eff_a879_c2b42c4e6364.slice/crio-a6943564da680e6ec7f4658306d3c9967f93355528d266c0055628b11f1431be WatchSource:0}: Error finding container a6943564da680e6ec7f4658306d3c9967f93355528d266c0055628b11f1431be: Status 404 returned error can't find the container with id a6943564da680e6ec7f4658306d3c9967f93355528d266c0055628b11f1431be Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.780752 4771 generic.go:334] "Generic (PLEG): container finished" podID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerID="daa8b6d6c018fc866fc424dc0af864a8d02a7b70ed46211915e7f4a3716f8be0" exitCode=0 Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.780866 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerDied","Data":"daa8b6d6c018fc866fc424dc0af864a8d02a7b70ed46211915e7f4a3716f8be0"} Oct 01 15:16:20 crc kubenswrapper[4771]: I1001 15:16:20.909988 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.049785 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-sg-core-conf-yaml\") pod \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.050195 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5b95\" (UniqueName: \"kubernetes.io/projected/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-kube-api-access-l5b95\") pod \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.050253 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-config-data\") pod \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.050329 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-run-httpd\") pod \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.050394 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-combined-ca-bundle\") pod \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.050413 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-scripts\") pod \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.050487 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-log-httpd\") pod \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\" (UID: \"f10e4587-5897-4af2-ae7f-5b4e3a8392d2\") " Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.050837 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f10e4587-5897-4af2-ae7f-5b4e3a8392d2" (UID: "f10e4587-5897-4af2-ae7f-5b4e3a8392d2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.051061 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f10e4587-5897-4af2-ae7f-5b4e3a8392d2" (UID: "f10e4587-5897-4af2-ae7f-5b4e3a8392d2"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.056909 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-scripts" (OuterVolumeSpecName: "scripts") pod "f10e4587-5897-4af2-ae7f-5b4e3a8392d2" (UID: "f10e4587-5897-4af2-ae7f-5b4e3a8392d2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.057056 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-kube-api-access-l5b95" (OuterVolumeSpecName: "kube-api-access-l5b95") pod "f10e4587-5897-4af2-ae7f-5b4e3a8392d2" (UID: "f10e4587-5897-4af2-ae7f-5b4e3a8392d2"). InnerVolumeSpecName "kube-api-access-l5b95". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.094755 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f10e4587-5897-4af2-ae7f-5b4e3a8392d2" (UID: "f10e4587-5897-4af2-ae7f-5b4e3a8392d2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.136123 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f10e4587-5897-4af2-ae7f-5b4e3a8392d2" (UID: "f10e4587-5897-4af2-ae7f-5b4e3a8392d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.152653 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.152679 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.152688 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5b95\" (UniqueName: \"kubernetes.io/projected/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-kube-api-access-l5b95\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.152699 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.152707 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.152715 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.157180 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-config-data" (OuterVolumeSpecName: "config-data") pod "f10e4587-5897-4af2-ae7f-5b4e3a8392d2" (UID: "f10e4587-5897-4af2-ae7f-5b4e3a8392d2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.254521 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f10e4587-5897-4af2-ae7f-5b4e3a8392d2-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.396616 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.397959 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.398206 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.402012 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.795921 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a8db97b3-f960-4eff-a879-c2b42c4e6364","Type":"ContainerStarted","Data":"de4e47c6491c66e96a5ecbe1e686a51a98a3cd7bab5314bbda31c80a8f6efc21"} Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.795989 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a8db97b3-f960-4eff-a879-c2b42c4e6364","Type":"ContainerStarted","Data":"a6943564da680e6ec7f4658306d3c9967f93355528d266c0055628b11f1431be"} Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.802202 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f10e4587-5897-4af2-ae7f-5b4e3a8392d2","Type":"ContainerDied","Data":"cae200a7dddeafe5f0049eae651d42530627b7b7546f359abe45a88e4b63655f"} Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.802286 4771 scope.go:117] 
"RemoveContainer" containerID="743a4a90e2a8d1bba869c28314e97739392e0e2b236c4521ab60d98951ff670d" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.802334 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.803145 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.827364 4771 scope.go:117] "RemoveContainer" containerID="d81312d3000de177a8e959024c071a0140f63f31b8ffbb4f615251640832a58f" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.832012 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.833115 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.8330871800000001 podStartE2EDuration="1.83308718s" podCreationTimestamp="2025-10-01 15:16:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:16:21.832929086 +0000 UTC m=+1226.452104257" watchObservedRunningTime="2025-10-01 15:16:21.83308718 +0000 UTC m=+1226.452262361" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.846955 4771 scope.go:117] "RemoveContainer" containerID="daa8b6d6c018fc866fc424dc0af864a8d02a7b70ed46211915e7f4a3716f8be0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.875164 4771 scope.go:117] "RemoveContainer" containerID="23ae913d6ac1cbd4c2707c2b990cdcad7e906aa212dfc4b3e288e4caa4aa4c4f" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.878371 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.903298 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] 
Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.918657 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:21 crc kubenswrapper[4771]: E1001 15:16:21.919384 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="ceilometer-central-agent" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.919510 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="ceilometer-central-agent" Oct 01 15:16:21 crc kubenswrapper[4771]: E1001 15:16:21.919602 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="ceilometer-notification-agent" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.919686 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="ceilometer-notification-agent" Oct 01 15:16:21 crc kubenswrapper[4771]: E1001 15:16:21.919822 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="sg-core" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.919898 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="sg-core" Oct 01 15:16:21 crc kubenswrapper[4771]: E1001 15:16:21.919978 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="proxy-httpd" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.920047 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="proxy-httpd" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.920353 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="sg-core" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 
15:16:21.920445 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="proxy-httpd" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.920557 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="ceilometer-central-agent" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.920658 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" containerName="ceilometer-notification-agent" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.939256 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.939539 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.945453 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.960023 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 15:16:21 crc kubenswrapper[4771]: I1001 15:16:21.960175 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.041541 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d74cde4-a358-4f81-9c52-7e4a9a1646f1" path="/var/lib/kubelet/pods/7d74cde4-a358-4f81-9c52-7e4a9a1646f1/volumes" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.042657 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10e4587-5897-4af2-ae7f-5b4e3a8392d2" path="/var/lib/kubelet/pods/f10e4587-5897-4af2-ae7f-5b4e3a8392d2/volumes" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.043418 4771 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-h7bvh"] Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.044926 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.053487 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-h7bvh"] Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.075671 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.075950 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5x8\" (UniqueName: \"kubernetes.io/projected/bbb43da8-f626-452c-88ac-5ff256382574-kube-api-access-pb5x8\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.076063 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-log-httpd\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.076184 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-config-data\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.076283 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-scripts\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.076372 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.076437 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-run-httpd\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.076513 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178443 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-config\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178488 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5x8\" (UniqueName: 
\"kubernetes.io/projected/bbb43da8-f626-452c-88ac-5ff256382574-kube-api-access-pb5x8\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178523 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-log-httpd\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178563 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178585 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6ngs\" (UniqueName: \"kubernetes.io/projected/368f9218-00d0-4a40-a746-6ba4e5a67d1d-kube-api-access-d6ngs\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178648 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-config-data\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178698 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-svc\") pod 
\"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178715 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-scripts\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178753 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178768 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-run-httpd\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178786 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178805 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: 
I1001 15:16:22.178831 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.178852 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.180071 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-log-httpd\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.182846 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-scripts\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.197607 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.198149 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-run-httpd\") 
pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.198183 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.198449 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.198569 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-config-data\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.200632 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5x8\" (UniqueName: \"kubernetes.io/projected/bbb43da8-f626-452c-88ac-5ff256382574-kube-api-access-pb5x8\") pod \"ceilometer-0\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.264215 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.280627 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6ngs\" (UniqueName: \"kubernetes.io/projected/368f9218-00d0-4a40-a746-6ba4e5a67d1d-kube-api-access-d6ngs\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.280742 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.280986 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.281483 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-svc\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.281769 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-swift-storage-0\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 
15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.281868 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.281926 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-config\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.281977 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.282528 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-config\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.282564 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-nb\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.282924 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-sb\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.296338 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6ngs\" (UniqueName: \"kubernetes.io/projected/368f9218-00d0-4a40-a746-6ba4e5a67d1d-kube-api-access-d6ngs\") pod \"dnsmasq-dns-59cf4bdb65-h7bvh\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.365305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:22 crc kubenswrapper[4771]: W1001 15:16:22.723361 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb43da8_f626_452c_88ac_5ff256382574.slice/crio-ca3fa1df8b34c9c967e0771d26f909bcebfeaa0e5d6fc7f743b7577a46a24521 WatchSource:0}: Error finding container ca3fa1df8b34c9c967e0771d26f909bcebfeaa0e5d6fc7f743b7577a46a24521: Status 404 returned error can't find the container with id ca3fa1df8b34c9c967e0771d26f909bcebfeaa0e5d6fc7f743b7577a46a24521 Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.730927 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.812363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerStarted","Data":"ca3fa1df8b34c9c967e0771d26f909bcebfeaa0e5d6fc7f743b7577a46a24521"} Oct 01 15:16:22 crc kubenswrapper[4771]: I1001 15:16:22.883834 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-59cf4bdb65-h7bvh"] Oct 01 15:16:22 crc kubenswrapper[4771]: W1001 15:16:22.884951 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod368f9218_00d0_4a40_a746_6ba4e5a67d1d.slice/crio-103122cd0a58b9480a1dffa15a11694133f3affc921745a5e14c3e537463e291 WatchSource:0}: Error finding container 103122cd0a58b9480a1dffa15a11694133f3affc921745a5e14c3e537463e291: Status 404 returned error can't find the container with id 103122cd0a58b9480a1dffa15a11694133f3affc921745a5e14c3e537463e291 Oct 01 15:16:23 crc kubenswrapper[4771]: I1001 15:16:23.666825 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:23 crc kubenswrapper[4771]: I1001 15:16:23.829602 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerStarted","Data":"882571b0d43051f92c1d3e576a5e289a55b8b3103e0191a333297085dc23daf2"} Oct 01 15:16:23 crc kubenswrapper[4771]: I1001 15:16:23.831141 4771 generic.go:334] "Generic (PLEG): container finished" podID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" containerID="c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4" exitCode=0 Oct 01 15:16:23 crc kubenswrapper[4771]: I1001 15:16:23.831281 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" event={"ID":"368f9218-00d0-4a40-a746-6ba4e5a67d1d","Type":"ContainerDied","Data":"c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4"} Oct 01 15:16:23 crc kubenswrapper[4771]: I1001 15:16:23.831328 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" event={"ID":"368f9218-00d0-4a40-a746-6ba4e5a67d1d","Type":"ContainerStarted","Data":"103122cd0a58b9480a1dffa15a11694133f3affc921745a5e14c3e537463e291"} Oct 01 15:16:24 crc kubenswrapper[4771]: I1001 15:16:24.502275 4771 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:24 crc kubenswrapper[4771]: I1001 15:16:24.842472 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerStarted","Data":"13b62963886d4022fd1b332ca948f28b5093d712975470f125fe0fc0861d2762"} Oct 01 15:16:24 crc kubenswrapper[4771]: I1001 15:16:24.844801 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-log" containerID="cri-o://f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8" gracePeriod=30 Oct 01 15:16:24 crc kubenswrapper[4771]: I1001 15:16:24.846063 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" event={"ID":"368f9218-00d0-4a40-a746-6ba4e5a67d1d","Type":"ContainerStarted","Data":"d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e"} Oct 01 15:16:24 crc kubenswrapper[4771]: I1001 15:16:24.846106 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:24 crc kubenswrapper[4771]: I1001 15:16:24.846446 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-api" containerID="cri-o://320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff" gracePeriod=30 Oct 01 15:16:24 crc kubenswrapper[4771]: I1001 15:16:24.875887 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" podStartSLOduration=3.8758666809999998 podStartE2EDuration="3.875866681s" podCreationTimestamp="2025-10-01 15:16:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 
15:16:24.871152934 +0000 UTC m=+1229.490328115" watchObservedRunningTime="2025-10-01 15:16:24.875866681 +0000 UTC m=+1229.495041862" Oct 01 15:16:25 crc kubenswrapper[4771]: I1001 15:16:25.482046 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:25 crc kubenswrapper[4771]: I1001 15:16:25.855887 4771 generic.go:334] "Generic (PLEG): container finished" podID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerID="f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8" exitCode=143 Oct 01 15:16:25 crc kubenswrapper[4771]: I1001 15:16:25.855959 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a233694a-62d8-4f80-bb50-5764b8fdc3bb","Type":"ContainerDied","Data":"f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8"} Oct 01 15:16:25 crc kubenswrapper[4771]: I1001 15:16:25.859641 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerStarted","Data":"734c189802e8a863a498e0547cc2dd49836abc181c2d9fbd40dd3baf3ad1d920"} Oct 01 15:16:27 crc kubenswrapper[4771]: I1001 15:16:27.889230 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerStarted","Data":"aca94d38e001a78334a93967eeaf2db30423e99ff823edecc946fd8319dbf528"} Oct 01 15:16:27 crc kubenswrapper[4771]: I1001 15:16:27.890971 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 15:16:27 crc kubenswrapper[4771]: I1001 15:16:27.890283 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="proxy-httpd" containerID="cri-o://aca94d38e001a78334a93967eeaf2db30423e99ff823edecc946fd8319dbf528" gracePeriod=30 Oct 01 15:16:27 crc kubenswrapper[4771]: 
I1001 15:16:27.889621 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="ceilometer-central-agent" containerID="cri-o://882571b0d43051f92c1d3e576a5e289a55b8b3103e0191a333297085dc23daf2" gracePeriod=30 Oct 01 15:16:27 crc kubenswrapper[4771]: I1001 15:16:27.890310 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="ceilometer-notification-agent" containerID="cri-o://13b62963886d4022fd1b332ca948f28b5093d712975470f125fe0fc0861d2762" gracePeriod=30 Oct 01 15:16:27 crc kubenswrapper[4771]: I1001 15:16:27.890298 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="sg-core" containerID="cri-o://734c189802e8a863a498e0547cc2dd49836abc181c2d9fbd40dd3baf3ad1d920" gracePeriod=30 Oct 01 15:16:27 crc kubenswrapper[4771]: I1001 15:16:27.917947 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.016897837 podStartE2EDuration="6.917926737s" podCreationTimestamp="2025-10-01 15:16:21 +0000 UTC" firstStartedPulling="2025-10-01 15:16:22.726047692 +0000 UTC m=+1227.345222863" lastFinishedPulling="2025-10-01 15:16:26.627076592 +0000 UTC m=+1231.246251763" observedRunningTime="2025-10-01 15:16:27.916329841 +0000 UTC m=+1232.535505022" watchObservedRunningTime="2025-10-01 15:16:27.917926737 +0000 UTC m=+1232.537101918" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.156808 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.435150 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.624478 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92w72\" (UniqueName: \"kubernetes.io/projected/a233694a-62d8-4f80-bb50-5764b8fdc3bb-kube-api-access-92w72\") pod \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.624522 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-config-data\") pod \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.624546 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-combined-ca-bundle\") pod \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.624624 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a233694a-62d8-4f80-bb50-5764b8fdc3bb-logs\") pod \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\" (UID: \"a233694a-62d8-4f80-bb50-5764b8fdc3bb\") " Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.625229 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a233694a-62d8-4f80-bb50-5764b8fdc3bb-logs" (OuterVolumeSpecName: "logs") pod "a233694a-62d8-4f80-bb50-5764b8fdc3bb" (UID: "a233694a-62d8-4f80-bb50-5764b8fdc3bb"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.632703 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a233694a-62d8-4f80-bb50-5764b8fdc3bb-kube-api-access-92w72" (OuterVolumeSpecName: "kube-api-access-92w72") pod "a233694a-62d8-4f80-bb50-5764b8fdc3bb" (UID: "a233694a-62d8-4f80-bb50-5764b8fdc3bb"). InnerVolumeSpecName "kube-api-access-92w72". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.649967 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a233694a-62d8-4f80-bb50-5764b8fdc3bb" (UID: "a233694a-62d8-4f80-bb50-5764b8fdc3bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.653281 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-config-data" (OuterVolumeSpecName: "config-data") pod "a233694a-62d8-4f80-bb50-5764b8fdc3bb" (UID: "a233694a-62d8-4f80-bb50-5764b8fdc3bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.727058 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92w72\" (UniqueName: \"kubernetes.io/projected/a233694a-62d8-4f80-bb50-5764b8fdc3bb-kube-api-access-92w72\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.727099 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.727113 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a233694a-62d8-4f80-bb50-5764b8fdc3bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.727128 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a233694a-62d8-4f80-bb50-5764b8fdc3bb-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.908261 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbb43da8-f626-452c-88ac-5ff256382574" containerID="aca94d38e001a78334a93967eeaf2db30423e99ff823edecc946fd8319dbf528" exitCode=0 Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.908654 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbb43da8-f626-452c-88ac-5ff256382574" containerID="734c189802e8a863a498e0547cc2dd49836abc181c2d9fbd40dd3baf3ad1d920" exitCode=2 Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.908322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerDied","Data":"aca94d38e001a78334a93967eeaf2db30423e99ff823edecc946fd8319dbf528"} Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.908718 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerDied","Data":"734c189802e8a863a498e0547cc2dd49836abc181c2d9fbd40dd3baf3ad1d920"} Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.908747 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerDied","Data":"13b62963886d4022fd1b332ca948f28b5093d712975470f125fe0fc0861d2762"} Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.908668 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbb43da8-f626-452c-88ac-5ff256382574" containerID="13b62963886d4022fd1b332ca948f28b5093d712975470f125fe0fc0861d2762" exitCode=0 Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.911368 4771 generic.go:334] "Generic (PLEG): container finished" podID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerID="320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff" exitCode=0 Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.911404 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a233694a-62d8-4f80-bb50-5764b8fdc3bb","Type":"ContainerDied","Data":"320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff"} Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.911428 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.911443 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a233694a-62d8-4f80-bb50-5764b8fdc3bb","Type":"ContainerDied","Data":"d703c695a748d4d5ab78ae982fd92f2a62023e0db12129ed9a119207131d7c0a"} Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.911477 4771 scope.go:117] "RemoveContainer" containerID="320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.943398 4771 scope.go:117] "RemoveContainer" containerID="f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.952175 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.961898 4771 scope.go:117] "RemoveContainer" containerID="320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff" Oct 01 15:16:28 crc kubenswrapper[4771]: E1001 15:16:28.962380 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff\": container with ID starting with 320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff not found: ID does not exist" containerID="320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.962423 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff"} err="failed to get container status \"320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff\": rpc error: code = NotFound desc = could not find container \"320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff\": container with ID starting with 
320d8a0267923fba7102455494ef87ae2981bd415e53210c65aa27b40bf64eff not found: ID does not exist" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.962446 4771 scope.go:117] "RemoveContainer" containerID="f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8" Oct 01 15:16:28 crc kubenswrapper[4771]: E1001 15:16:28.962807 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8\": container with ID starting with f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8 not found: ID does not exist" containerID="f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.962835 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8"} err="failed to get container status \"f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8\": rpc error: code = NotFound desc = could not find container \"f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8\": container with ID starting with f7dab890de9147316571d9b93fbaaef46ba889a825cab999155a26a7ccb7bed8 not found: ID does not exist" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.963038 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.971974 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:28 crc kubenswrapper[4771]: E1001 15:16:28.972398 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-api" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.972418 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" 
containerName="nova-api-api" Oct 01 15:16:28 crc kubenswrapper[4771]: E1001 15:16:28.972448 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-log" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.972458 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-log" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.972632 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-log" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.972649 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" containerName="nova-api-api" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.973563 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.975709 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.975895 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.976023 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 15:16:28 crc kubenswrapper[4771]: I1001 15:16:28.996889 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.134716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-config-data\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 
15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.134789 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nb4gk\" (UniqueName: \"kubernetes.io/projected/4909086b-5a48-46b7-af75-e54bddc4014e-kube-api-access-nb4gk\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.135003 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.135090 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4909086b-5a48-46b7-af75-e54bddc4014e-logs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.135166 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-public-tls-certs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.135212 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.236783 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-config-data\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.236843 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nb4gk\" (UniqueName: \"kubernetes.io/projected/4909086b-5a48-46b7-af75-e54bddc4014e-kube-api-access-nb4gk\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.236903 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.236962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4909086b-5a48-46b7-af75-e54bddc4014e-logs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.237004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-public-tls-certs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.237037 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc 
kubenswrapper[4771]: I1001 15:16:29.238399 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4909086b-5a48-46b7-af75-e54bddc4014e-logs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.242233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-public-tls-certs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.242233 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.243631 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.256274 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-config-data\") pod \"nova-api-0\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.257636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nb4gk\" (UniqueName: \"kubernetes.io/projected/4909086b-5a48-46b7-af75-e54bddc4014e-kube-api-access-nb4gk\") pod \"nova-api-0\" (UID: 
\"4909086b-5a48-46b7-af75-e54bddc4014e\") " pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.291881 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.759415 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.933475 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4909086b-5a48-46b7-af75-e54bddc4014e","Type":"ContainerStarted","Data":"1b12ecb27fa9880765a8df4a62ec21b0f92ed7c7043ee970caa275baac1d02fb"} Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.939362 4771 generic.go:334] "Generic (PLEG): container finished" podID="bbb43da8-f626-452c-88ac-5ff256382574" containerID="882571b0d43051f92c1d3e576a5e289a55b8b3103e0191a333297085dc23daf2" exitCode=0 Oct 01 15:16:29 crc kubenswrapper[4771]: I1001 15:16:29.939408 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerDied","Data":"882571b0d43051f92c1d3e576a5e289a55b8b3103e0191a333297085dc23daf2"} Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.027452 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a233694a-62d8-4f80-bb50-5764b8fdc3bb" path="/var/lib/kubelet/pods/a233694a-62d8-4f80-bb50-5764b8fdc3bb/volumes" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.230774 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.361827 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-config-data\") pod \"bbb43da8-f626-452c-88ac-5ff256382574\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.361933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb5x8\" (UniqueName: \"kubernetes.io/projected/bbb43da8-f626-452c-88ac-5ff256382574-kube-api-access-pb5x8\") pod \"bbb43da8-f626-452c-88ac-5ff256382574\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.361961 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-run-httpd\") pod \"bbb43da8-f626-452c-88ac-5ff256382574\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.361985 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-ceilometer-tls-certs\") pod \"bbb43da8-f626-452c-88ac-5ff256382574\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.362037 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-scripts\") pod \"bbb43da8-f626-452c-88ac-5ff256382574\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.362091 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-log-httpd\") pod \"bbb43da8-f626-452c-88ac-5ff256382574\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.362121 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-combined-ca-bundle\") pod \"bbb43da8-f626-452c-88ac-5ff256382574\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.362144 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-sg-core-conf-yaml\") pod \"bbb43da8-f626-452c-88ac-5ff256382574\" (UID: \"bbb43da8-f626-452c-88ac-5ff256382574\") " Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.362784 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bbb43da8-f626-452c-88ac-5ff256382574" (UID: "bbb43da8-f626-452c-88ac-5ff256382574"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.362961 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bbb43da8-f626-452c-88ac-5ff256382574" (UID: "bbb43da8-f626-452c-88ac-5ff256382574"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.366834 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb43da8-f626-452c-88ac-5ff256382574-kube-api-access-pb5x8" (OuterVolumeSpecName: "kube-api-access-pb5x8") pod "bbb43da8-f626-452c-88ac-5ff256382574" (UID: "bbb43da8-f626-452c-88ac-5ff256382574"). InnerVolumeSpecName "kube-api-access-pb5x8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.367903 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-scripts" (OuterVolumeSpecName: "scripts") pod "bbb43da8-f626-452c-88ac-5ff256382574" (UID: "bbb43da8-f626-452c-88ac-5ff256382574"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.397697 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bbb43da8-f626-452c-88ac-5ff256382574" (UID: "bbb43da8-f626-452c-88ac-5ff256382574"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.450824 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "bbb43da8-f626-452c-88ac-5ff256382574" (UID: "bbb43da8-f626-452c-88ac-5ff256382574"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.458144 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbb43da8-f626-452c-88ac-5ff256382574" (UID: "bbb43da8-f626-452c-88ac-5ff256382574"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.464313 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb5x8\" (UniqueName: \"kubernetes.io/projected/bbb43da8-f626-452c-88ac-5ff256382574-kube-api-access-pb5x8\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.464349 4771 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.464360 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.464370 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.464407 4771 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bbb43da8-f626-452c-88ac-5ff256382574-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.464418 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.464427 4771 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.480001 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-config-data" (OuterVolumeSpecName: "config-data") pod "bbb43da8-f626-452c-88ac-5ff256382574" (UID: "bbb43da8-f626-452c-88ac-5ff256382574"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.482700 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.500424 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.566449 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb43da8-f626-452c-88ac-5ff256382574-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.955517 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bbb43da8-f626-452c-88ac-5ff256382574","Type":"ContainerDied","Data":"ca3fa1df8b34c9c967e0771d26f909bcebfeaa0e5d6fc7f743b7577a46a24521"} Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.955577 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.955605 4771 scope.go:117] "RemoveContainer" containerID="aca94d38e001a78334a93967eeaf2db30423e99ff823edecc946fd8319dbf528" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.958448 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4909086b-5a48-46b7-af75-e54bddc4014e","Type":"ContainerStarted","Data":"35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1"} Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.958487 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4909086b-5a48-46b7-af75-e54bddc4014e","Type":"ContainerStarted","Data":"9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1"} Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.979877 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.996154 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.996134283 podStartE2EDuration="2.996134283s" podCreationTimestamp="2025-10-01 15:16:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:16:30.9816094 +0000 UTC m=+1235.600784581" watchObservedRunningTime="2025-10-01 15:16:30.996134283 +0000 UTC m=+1235.615309454" Oct 01 15:16:30 crc kubenswrapper[4771]: I1001 15:16:30.997791 4771 scope.go:117] "RemoveContainer" containerID="734c189802e8a863a498e0547cc2dd49836abc181c2d9fbd40dd3baf3ad1d920" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.044926 4771 scope.go:117] "RemoveContainer" containerID="13b62963886d4022fd1b332ca948f28b5093d712975470f125fe0fc0861d2762" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.045072 4771 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.052793 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.076626 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:31 crc kubenswrapper[4771]: E1001 15:16:31.077090 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="ceilometer-notification-agent" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.077105 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="ceilometer-notification-agent" Oct 01 15:16:31 crc kubenswrapper[4771]: E1001 15:16:31.077121 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="proxy-httpd" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.077129 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="proxy-httpd" Oct 01 15:16:31 crc kubenswrapper[4771]: E1001 15:16:31.077144 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="ceilometer-central-agent" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.077153 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="ceilometer-central-agent" Oct 01 15:16:31 crc kubenswrapper[4771]: E1001 15:16:31.077169 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="sg-core" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.077178 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="sg-core" Oct 01 15:16:31 crc 
kubenswrapper[4771]: I1001 15:16:31.077407 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="ceilometer-central-agent" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.077432 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="proxy-httpd" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.077450 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="ceilometer-notification-agent" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.077477 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb43da8-f626-452c-88ac-5ff256382574" containerName="sg-core" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.082520 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.087024 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.089375 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.089616 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.111484 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.119853 4771 scope.go:117] "RemoveContainer" containerID="882571b0d43051f92c1d3e576a5e289a55b8b3103e0191a333297085dc23daf2" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.183766 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/3fb688fd-269e-4d0f-a84f-ccb670696d20-run-httpd\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.184182 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.184232 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-config-data\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.187870 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fb688fd-269e-4d0f-a84f-ccb670696d20-log-httpd\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.188000 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.188101 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v65jq\" (UniqueName: \"kubernetes.io/projected/3fb688fd-269e-4d0f-a84f-ccb670696d20-kube-api-access-v65jq\") pod \"ceilometer-0\" (UID: 
\"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.188129 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-scripts\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.188265 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.214897 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-w58g7"] Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.216126 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.218134 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.218844 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.223032 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-w58g7"] Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290293 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fb688fd-269e-4d0f-a84f-ccb670696d20-log-httpd\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290336 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-scripts\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290390 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v65jq\" (UniqueName: \"kubernetes.io/projected/3fb688fd-269e-4d0f-a84f-ccb670696d20-kube-api-access-v65jq\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" 
Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290430 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290461 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290507 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fb688fd-269e-4d0f-a84f-ccb670696d20-run-httpd\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290525 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-scripts\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290553 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-config-data\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290575 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290599 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79vl6\" (UniqueName: \"kubernetes.io/projected/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-kube-api-access-79vl6\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290617 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-config-data\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290760 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fb688fd-269e-4d0f-a84f-ccb670696d20-log-httpd\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.290968 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3fb688fd-269e-4d0f-a84f-ccb670696d20-run-httpd\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.295540 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.299439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-config-data\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.299535 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-scripts\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.300310 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.301881 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fb688fd-269e-4d0f-a84f-ccb670696d20-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.309330 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v65jq\" (UniqueName: \"kubernetes.io/projected/3fb688fd-269e-4d0f-a84f-ccb670696d20-kube-api-access-v65jq\") pod \"ceilometer-0\" (UID: \"3fb688fd-269e-4d0f-a84f-ccb670696d20\") " pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.398018 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.398109 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-scripts\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.398149 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-config-data\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.398197 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79vl6\" (UniqueName: \"kubernetes.io/projected/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-kube-api-access-79vl6\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.401787 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-scripts\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.401945 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.402372 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-config-data\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.415933 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79vl6\" (UniqueName: \"kubernetes.io/projected/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-kube-api-access-79vl6\") pod \"nova-cell1-cell-mapping-w58g7\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.417044 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.533908 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.834116 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 01 15:16:31 crc kubenswrapper[4771]: W1001 15:16:31.842127 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fb688fd_269e_4d0f_a84f_ccb670696d20.slice/crio-fe6cc70d7f536a10f7957d575ecb226a93394aeeb0d2f2775dd4931dffec9d29 WatchSource:0}: Error finding container fe6cc70d7f536a10f7957d575ecb226a93394aeeb0d2f2775dd4931dffec9d29: Status 404 returned error can't find the container with id fe6cc70d7f536a10f7957d575ecb226a93394aeeb0d2f2775dd4931dffec9d29 Oct 01 15:16:31 crc kubenswrapper[4771]: I1001 15:16:31.969141 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fb688fd-269e-4d0f-a84f-ccb670696d20","Type":"ContainerStarted","Data":"fe6cc70d7f536a10f7957d575ecb226a93394aeeb0d2f2775dd4931dffec9d29"} Oct 01 15:16:31 crc kubenswrapper[4771]: W1001 15:16:31.997875 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod609dfd55_d3d8_4ae3_b8ce_9b64c07ac798.slice/crio-e68915c9f85c0e2fcb6834df60060a4c98b3e87e1bf04af1456b1bf844c71502 WatchSource:0}: Error finding container e68915c9f85c0e2fcb6834df60060a4c98b3e87e1bf04af1456b1bf844c71502: Status 404 returned error can't find the container with id e68915c9f85c0e2fcb6834df60060a4c98b3e87e1bf04af1456b1bf844c71502 Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.002398 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb43da8-f626-452c-88ac-5ff256382574" path="/var/lib/kubelet/pods/bbb43da8-f626-452c-88ac-5ff256382574/volumes" Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.003466 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell1-cell-mapping-w58g7"] Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.367022 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.448183 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4wgt5"] Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.448492 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" podUID="b3a0a923-de02-41f9-81ea-79f1529d12e4" containerName="dnsmasq-dns" containerID="cri-o://a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e" gracePeriod=10 Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.894133 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.981968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fb688fd-269e-4d0f-a84f-ccb670696d20","Type":"ContainerStarted","Data":"fea171086ef76090421728587a47d9a2794e4ca7300f806b1f55682cf3ebb406"} Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.983345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w58g7" event={"ID":"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798","Type":"ContainerStarted","Data":"7db5e2a842876c469532a0a62c92604ad4c42a6288c773c3789a76d4565d4977"} Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.983692 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w58g7" event={"ID":"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798","Type":"ContainerStarted","Data":"e68915c9f85c0e2fcb6834df60060a4c98b3e87e1bf04af1456b1bf844c71502"} Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.990181 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="b3a0a923-de02-41f9-81ea-79f1529d12e4" containerID="a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e" exitCode=0 Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.990218 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" event={"ID":"b3a0a923-de02-41f9-81ea-79f1529d12e4","Type":"ContainerDied","Data":"a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e"} Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.990260 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" event={"ID":"b3a0a923-de02-41f9-81ea-79f1529d12e4","Type":"ContainerDied","Data":"a6e3a2fda0e7ca42978b5f6defc8d67ddeedb2b0afee56e1c5226d4b96b57e89"} Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.990278 4771 scope.go:117] "RemoveContainer" containerID="a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e" Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.990292 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845d6d6f59-4wgt5" Oct 01 15:16:32 crc kubenswrapper[4771]: I1001 15:16:32.999334 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-w58g7" podStartSLOduration=1.9993158709999999 podStartE2EDuration="1.999315871s" podCreationTimestamp="2025-10-01 15:16:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:16:32.998221446 +0000 UTC m=+1237.617396627" watchObservedRunningTime="2025-10-01 15:16:32.999315871 +0000 UTC m=+1237.618491042" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.010363 4771 scope.go:117] "RemoveContainer" containerID="1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.029937 4771 scope.go:117] "RemoveContainer" containerID="a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e" Oct 01 15:16:33 crc kubenswrapper[4771]: E1001 15:16:33.030499 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e\": container with ID starting with a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e not found: ID does not exist" containerID="a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.030534 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e"} err="failed to get container status \"a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e\": rpc error: code = NotFound desc = could not find container \"a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e\": container with ID starting with 
a4ccff7103fb4b263cb63d12ea6893044788b0999974860a95366abf99eedb0e not found: ID does not exist" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.030555 4771 scope.go:117] "RemoveContainer" containerID="1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042" Oct 01 15:16:33 crc kubenswrapper[4771]: E1001 15:16:33.030836 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042\": container with ID starting with 1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042 not found: ID does not exist" containerID="1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.030858 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042"} err="failed to get container status \"1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042\": rpc error: code = NotFound desc = could not find container \"1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042\": container with ID starting with 1ff57663ba973863ef573aa4564036f34adab79db261e35bb36d7cbfc77a2042 not found: ID does not exist" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.035664 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-config\") pod \"b3a0a923-de02-41f9-81ea-79f1529d12e4\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.035784 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-sb\") pod \"b3a0a923-de02-41f9-81ea-79f1529d12e4\" (UID: 
\"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.035814 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2b8d7\" (UniqueName: \"kubernetes.io/projected/b3a0a923-de02-41f9-81ea-79f1529d12e4-kube-api-access-2b8d7\") pod \"b3a0a923-de02-41f9-81ea-79f1529d12e4\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.035853 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-svc\") pod \"b3a0a923-de02-41f9-81ea-79f1529d12e4\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.035919 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-nb\") pod \"b3a0a923-de02-41f9-81ea-79f1529d12e4\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.035938 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-swift-storage-0\") pod \"b3a0a923-de02-41f9-81ea-79f1529d12e4\" (UID: \"b3a0a923-de02-41f9-81ea-79f1529d12e4\") " Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.041268 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a0a923-de02-41f9-81ea-79f1529d12e4-kube-api-access-2b8d7" (OuterVolumeSpecName: "kube-api-access-2b8d7") pod "b3a0a923-de02-41f9-81ea-79f1529d12e4" (UID: "b3a0a923-de02-41f9-81ea-79f1529d12e4"). InnerVolumeSpecName "kube-api-access-2b8d7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.085783 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-config" (OuterVolumeSpecName: "config") pod "b3a0a923-de02-41f9-81ea-79f1529d12e4" (UID: "b3a0a923-de02-41f9-81ea-79f1529d12e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.087087 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b3a0a923-de02-41f9-81ea-79f1529d12e4" (UID: "b3a0a923-de02-41f9-81ea-79f1529d12e4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.102403 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b3a0a923-de02-41f9-81ea-79f1529d12e4" (UID: "b3a0a923-de02-41f9-81ea-79f1529d12e4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.105530 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b3a0a923-de02-41f9-81ea-79f1529d12e4" (UID: "b3a0a923-de02-41f9-81ea-79f1529d12e4"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.108211 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b3a0a923-de02-41f9-81ea-79f1529d12e4" (UID: "b3a0a923-de02-41f9-81ea-79f1529d12e4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.137496 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.137526 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.137537 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2b8d7\" (UniqueName: \"kubernetes.io/projected/b3a0a923-de02-41f9-81ea-79f1529d12e4-kube-api-access-2b8d7\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.137546 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.137554 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.137561 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/b3a0a923-de02-41f9-81ea-79f1529d12e4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.376657 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4wgt5"] Oct 01 15:16:33 crc kubenswrapper[4771]: I1001 15:16:33.386309 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845d6d6f59-4wgt5"] Oct 01 15:16:34 crc kubenswrapper[4771]: I1001 15:16:34.016932 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a0a923-de02-41f9-81ea-79f1529d12e4" path="/var/lib/kubelet/pods/b3a0a923-de02-41f9-81ea-79f1529d12e4/volumes" Oct 01 15:16:34 crc kubenswrapper[4771]: I1001 15:16:34.027795 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fb688fd-269e-4d0f-a84f-ccb670696d20","Type":"ContainerStarted","Data":"5e4f1d625a696e417cc43159d5b35bfa7428d03aa9ac4702934a518022dbd4b9"} Oct 01 15:16:35 crc kubenswrapper[4771]: I1001 15:16:35.040121 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fb688fd-269e-4d0f-a84f-ccb670696d20","Type":"ContainerStarted","Data":"12a8aaacab7d537bc4774e6233862dbb642d7d08d547e7a4218ebe60ce5c72e3"} Oct 01 15:16:36 crc kubenswrapper[4771]: I1001 15:16:36.051277 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3fb688fd-269e-4d0f-a84f-ccb670696d20","Type":"ContainerStarted","Data":"8ccf12a7e3a9d07dfa2be92b9ce140fb8fc1a5d6204f2325c060bd05537ef4e0"} Oct 01 15:16:36 crc kubenswrapper[4771]: I1001 15:16:36.052778 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 01 15:16:36 crc kubenswrapper[4771]: I1001 15:16:36.075370 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.434465681 podStartE2EDuration="5.075353776s" 
podCreationTimestamp="2025-10-01 15:16:31 +0000 UTC" firstStartedPulling="2025-10-01 15:16:31.846638833 +0000 UTC m=+1236.465814004" lastFinishedPulling="2025-10-01 15:16:35.487526888 +0000 UTC m=+1240.106702099" observedRunningTime="2025-10-01 15:16:36.068016477 +0000 UTC m=+1240.687191668" watchObservedRunningTime="2025-10-01 15:16:36.075353776 +0000 UTC m=+1240.694528947" Oct 01 15:16:37 crc kubenswrapper[4771]: I1001 15:16:37.062074 4771 generic.go:334] "Generic (PLEG): container finished" podID="609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" containerID="7db5e2a842876c469532a0a62c92604ad4c42a6288c773c3789a76d4565d4977" exitCode=0 Oct 01 15:16:37 crc kubenswrapper[4771]: I1001 15:16:37.063774 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w58g7" event={"ID":"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798","Type":"ContainerDied","Data":"7db5e2a842876c469532a0a62c92604ad4c42a6288c773c3789a76d4565d4977"} Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.523638 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.659409 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-combined-ca-bundle\") pod \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.659652 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-config-data\") pod \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.659805 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79vl6\" (UniqueName: \"kubernetes.io/projected/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-kube-api-access-79vl6\") pod \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.659831 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-scripts\") pod \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\" (UID: \"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798\") " Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.664928 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-kube-api-access-79vl6" (OuterVolumeSpecName: "kube-api-access-79vl6") pod "609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" (UID: "609dfd55-d3d8-4ae3-b8ce-9b64c07ac798"). InnerVolumeSpecName "kube-api-access-79vl6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.677091 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-scripts" (OuterVolumeSpecName: "scripts") pod "609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" (UID: "609dfd55-d3d8-4ae3-b8ce-9b64c07ac798"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.698767 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-config-data" (OuterVolumeSpecName: "config-data") pod "609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" (UID: "609dfd55-d3d8-4ae3-b8ce-9b64c07ac798"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.716146 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" (UID: "609dfd55-d3d8-4ae3-b8ce-9b64c07ac798"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.761576 4771 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-scripts\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.761615 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.761625 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:38 crc kubenswrapper[4771]: I1001 15:16:38.761634 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79vl6\" (UniqueName: \"kubernetes.io/projected/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798-kube-api-access-79vl6\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.092410 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-w58g7" event={"ID":"609dfd55-d3d8-4ae3-b8ce-9b64c07ac798","Type":"ContainerDied","Data":"e68915c9f85c0e2fcb6834df60060a4c98b3e87e1bf04af1456b1bf844c71502"} Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.092453 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e68915c9f85c0e2fcb6834df60060a4c98b3e87e1bf04af1456b1bf844c71502" Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.092515 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-w58g7" Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.292526 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.293125 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.302993 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.318079 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.318430 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2d6c6725-6778-463a-89ab-d042864eca91" containerName="nova-scheduler-scheduler" containerID="cri-o://6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606" gracePeriod=30 Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.335925 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.336267 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-metadata" containerID="cri-o://f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970" gracePeriod=30 Oct 01 15:16:39 crc kubenswrapper[4771]: I1001 15:16:39.336212 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-log" containerID="cri-o://1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf" gracePeriod=30 Oct 01 15:16:40 crc kubenswrapper[4771]: I1001 15:16:40.121097 4771 
generic.go:334] "Generic (PLEG): container finished" podID="c614d971-1db4-4f84-95d3-214aff9ed919" containerID="1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf" exitCode=143 Oct 01 15:16:40 crc kubenswrapper[4771]: I1001 15:16:40.123465 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c614d971-1db4-4f84-95d3-214aff9ed919","Type":"ContainerDied","Data":"1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf"} Oct 01 15:16:40 crc kubenswrapper[4771]: I1001 15:16:40.303977 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:40 crc kubenswrapper[4771]: I1001 15:16:40.304296 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.203:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:40 crc kubenswrapper[4771]: E1001 15:16:40.969392 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 15:16:40 crc kubenswrapper[4771]: E1001 15:16:40.970826 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] 
Oct 01 15:16:40 crc kubenswrapper[4771]: E1001 15:16:40.972928 4771 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 01 15:16:40 crc kubenswrapper[4771]: E1001 15:16:40.972968 4771 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2d6c6725-6778-463a-89ab-d042864eca91" containerName="nova-scheduler-scheduler" Oct 01 15:16:41 crc kubenswrapper[4771]: I1001 15:16:41.132952 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-log" containerID="cri-o://9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1" gracePeriod=30 Oct 01 15:16:41 crc kubenswrapper[4771]: I1001 15:16:41.133143 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-api" containerID="cri-o://35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1" gracePeriod=30 Oct 01 15:16:42 crc kubenswrapper[4771]: I1001 15:16:42.149201 4771 generic.go:334] "Generic (PLEG): container finished" podID="4909086b-5a48-46b7-af75-e54bddc4014e" containerID="9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1" exitCode=143 Oct 01 15:16:42 crc kubenswrapper[4771]: I1001 15:16:42.149296 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4909086b-5a48-46b7-af75-e54bddc4014e","Type":"ContainerDied","Data":"9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1"} Oct 01 15:16:42 crc kubenswrapper[4771]: I1001 15:16:42.471191 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:58298->10.217.0.196:8775: read: connection reset by peer" Oct 01 15:16:42 crc kubenswrapper[4771]: I1001 15:16:42.471538 4771 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.196:8775/\": read tcp 10.217.0.2:58314->10.217.0.196:8775: read: connection reset by peer" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.002756 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.050637 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-config-data\") pod \"c614d971-1db4-4f84-95d3-214aff9ed919\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.051099 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c614d971-1db4-4f84-95d3-214aff9ed919-logs\") pod \"c614d971-1db4-4f84-95d3-214aff9ed919\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.051138 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-nova-metadata-tls-certs\") pod \"c614d971-1db4-4f84-95d3-214aff9ed919\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.051350 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjntr\" (UniqueName: \"kubernetes.io/projected/c614d971-1db4-4f84-95d3-214aff9ed919-kube-api-access-qjntr\") pod \"c614d971-1db4-4f84-95d3-214aff9ed919\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.051482 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-combined-ca-bundle\") pod \"c614d971-1db4-4f84-95d3-214aff9ed919\" (UID: \"c614d971-1db4-4f84-95d3-214aff9ed919\") " Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.051813 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c614d971-1db4-4f84-95d3-214aff9ed919-logs" (OuterVolumeSpecName: "logs") pod "c614d971-1db4-4f84-95d3-214aff9ed919" (UID: "c614d971-1db4-4f84-95d3-214aff9ed919"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.051952 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c614d971-1db4-4f84-95d3-214aff9ed919-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.056325 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c614d971-1db4-4f84-95d3-214aff9ed919-kube-api-access-qjntr" (OuterVolumeSpecName: "kube-api-access-qjntr") pod "c614d971-1db4-4f84-95d3-214aff9ed919" (UID: "c614d971-1db4-4f84-95d3-214aff9ed919"). InnerVolumeSpecName "kube-api-access-qjntr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.091545 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c614d971-1db4-4f84-95d3-214aff9ed919" (UID: "c614d971-1db4-4f84-95d3-214aff9ed919"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.103059 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-config-data" (OuterVolumeSpecName: "config-data") pod "c614d971-1db4-4f84-95d3-214aff9ed919" (UID: "c614d971-1db4-4f84-95d3-214aff9ed919"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.122109 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "c614d971-1db4-4f84-95d3-214aff9ed919" (UID: "c614d971-1db4-4f84-95d3-214aff9ed919"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.153420 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjntr\" (UniqueName: \"kubernetes.io/projected/c614d971-1db4-4f84-95d3-214aff9ed919-kube-api-access-qjntr\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.153446 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.153456 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.153470 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/c614d971-1db4-4f84-95d3-214aff9ed919-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.168587 4771 generic.go:334] "Generic (PLEG): container finished" podID="c614d971-1db4-4f84-95d3-214aff9ed919" containerID="f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970" exitCode=0 Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.168622 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.168641 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c614d971-1db4-4f84-95d3-214aff9ed919","Type":"ContainerDied","Data":"f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970"} Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.168703 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"c614d971-1db4-4f84-95d3-214aff9ed919","Type":"ContainerDied","Data":"aafb4cd65b25fce3deff85f098ecb6c17d3bda5d3d94554ec66fe39def4d8e6e"} Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.168766 4771 scope.go:117] "RemoveContainer" containerID="f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.207230 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.211908 4771 scope.go:117] "RemoveContainer" containerID="1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.219833 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.227240 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:16:43 crc kubenswrapper[4771]: E1001 15:16:43.227610 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-metadata" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.227626 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-metadata" Oct 01 15:16:43 crc kubenswrapper[4771]: E1001 15:16:43.227648 4771 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-log" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.227656 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-log" Oct 01 15:16:43 crc kubenswrapper[4771]: E1001 15:16:43.227668 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a0a923-de02-41f9-81ea-79f1529d12e4" containerName="dnsmasq-dns" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.227674 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a0a923-de02-41f9-81ea-79f1529d12e4" containerName="dnsmasq-dns" Oct 01 15:16:43 crc kubenswrapper[4771]: E1001 15:16:43.227691 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a0a923-de02-41f9-81ea-79f1529d12e4" containerName="init" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.227697 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a0a923-de02-41f9-81ea-79f1529d12e4" containerName="init" Oct 01 15:16:43 crc kubenswrapper[4771]: E1001 15:16:43.227704 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" containerName="nova-manage" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.227710 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" containerName="nova-manage" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.228306 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a0a923-de02-41f9-81ea-79f1529d12e4" containerName="dnsmasq-dns" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.228333 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-metadata" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.228363 4771 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" containerName="nova-manage" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.228381 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" containerName="nova-metadata-log" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.229494 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.231468 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.233713 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.240056 4771 scope.go:117] "RemoveContainer" containerID="f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970" Oct 01 15:16:43 crc kubenswrapper[4771]: E1001 15:16:43.240500 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970\": container with ID starting with f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970 not found: ID does not exist" containerID="f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.240539 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970"} err="failed to get container status \"f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970\": rpc error: code = NotFound desc = could not find container \"f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970\": container with ID starting with 
f80733293ff439bb23cf5e76e0b49ae174a23b29f269b60b2a64a9438f090970 not found: ID does not exist" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.240564 4771 scope.go:117] "RemoveContainer" containerID="1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf" Oct 01 15:16:43 crc kubenswrapper[4771]: E1001 15:16:43.241067 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf\": container with ID starting with 1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf not found: ID does not exist" containerID="1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.241089 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf"} err="failed to get container status \"1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf\": rpc error: code = NotFound desc = could not find container \"1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf\": container with ID starting with 1d7c9f7d4164323adc8a025de5fe1b86d5e8636d9a188839ad1391abbe6737bf not found: ID does not exist" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.243598 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.254139 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-config-data\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.254220 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.254317 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10227ed2-4069-45bc-b3b9-091bb98d72af-logs\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.254407 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6m54\" (UniqueName: \"kubernetes.io/projected/10227ed2-4069-45bc-b3b9-091bb98d72af-kube-api-access-n6m54\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.254449 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.356272 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.356371 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-config-data\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.356464 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.356489 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10227ed2-4069-45bc-b3b9-091bb98d72af-logs\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.356541 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6m54\" (UniqueName: \"kubernetes.io/projected/10227ed2-4069-45bc-b3b9-091bb98d72af-kube-api-access-n6m54\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.357281 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10227ed2-4069-45bc-b3b9-091bb98d72af-logs\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.360027 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc 
kubenswrapper[4771]: I1001 15:16:43.360105 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-config-data\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.363239 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/10227ed2-4069-45bc-b3b9-091bb98d72af-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.387298 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6m54\" (UniqueName: \"kubernetes.io/projected/10227ed2-4069-45bc-b3b9-091bb98d72af-kube-api-access-n6m54\") pod \"nova-metadata-0\" (UID: \"10227ed2-4069-45bc-b3b9-091bb98d72af\") " pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.549825 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.998775 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c614d971-1db4-4f84-95d3-214aff9ed919" path="/var/lib/kubelet/pods/c614d971-1db4-4f84-95d3-214aff9ed919/volumes" Oct 01 15:16:43 crc kubenswrapper[4771]: I1001 15:16:43.999532 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 01 15:16:44 crc kubenswrapper[4771]: I1001 15:16:44.186419 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10227ed2-4069-45bc-b3b9-091bb98d72af","Type":"ContainerStarted","Data":"790d7fba72a3b27429f383dec9f18737cb2de2594d327c126d1a0e97714e1c89"} Oct 01 15:16:44 crc kubenswrapper[4771]: I1001 15:16:44.831419 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:16:44 crc kubenswrapper[4771]: I1001 15:16:44.988627 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-combined-ca-bundle\") pod \"2d6c6725-6778-463a-89ab-d042864eca91\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " Oct 01 15:16:44 crc kubenswrapper[4771]: I1001 15:16:44.988726 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-config-data\") pod \"2d6c6725-6778-463a-89ab-d042864eca91\" (UID: \"2d6c6725-6778-463a-89ab-d042864eca91\") " Oct 01 15:16:44 crc kubenswrapper[4771]: I1001 15:16:44.988931 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2skw8\" (UniqueName: \"kubernetes.io/projected/2d6c6725-6778-463a-89ab-d042864eca91-kube-api-access-2skw8\") pod \"2d6c6725-6778-463a-89ab-d042864eca91\" (UID: 
\"2d6c6725-6778-463a-89ab-d042864eca91\") " Oct 01 15:16:44 crc kubenswrapper[4771]: I1001 15:16:44.994937 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6c6725-6778-463a-89ab-d042864eca91-kube-api-access-2skw8" (OuterVolumeSpecName: "kube-api-access-2skw8") pod "2d6c6725-6778-463a-89ab-d042864eca91" (UID: "2d6c6725-6778-463a-89ab-d042864eca91"). InnerVolumeSpecName "kube-api-access-2skw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.024104 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d6c6725-6778-463a-89ab-d042864eca91" (UID: "2d6c6725-6778-463a-89ab-d042864eca91"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.027686 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-config-data" (OuterVolumeSpecName: "config-data") pod "2d6c6725-6778-463a-89ab-d042864eca91" (UID: "2d6c6725-6778-463a-89ab-d042864eca91"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.092453 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.092491 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6c6725-6778-463a-89ab-d042864eca91-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.092503 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2skw8\" (UniqueName: \"kubernetes.io/projected/2d6c6725-6778-463a-89ab-d042864eca91-kube-api-access-2skw8\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.204020 4771 generic.go:334] "Generic (PLEG): container finished" podID="2d6c6725-6778-463a-89ab-d042864eca91" containerID="6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606" exitCode=0 Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.204119 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d6c6725-6778-463a-89ab-d042864eca91","Type":"ContainerDied","Data":"6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606"} Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.204153 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2d6c6725-6778-463a-89ab-d042864eca91","Type":"ContainerDied","Data":"8ad98d7e781bc73b8e86d0fd4d2b4af030534be095e524d4fc3b4d930285a342"} Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.204175 4771 scope.go:117] "RemoveContainer" containerID="6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.204186 4771 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.206647 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10227ed2-4069-45bc-b3b9-091bb98d72af","Type":"ContainerStarted","Data":"cf6dfc17defa77ba31f0d7ad7a4a0ccafd34a984c70bf073d2ef8edd84146d7f"} Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.206678 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"10227ed2-4069-45bc-b3b9-091bb98d72af","Type":"ContainerStarted","Data":"45dd34f44cdb1727451cff0a926e810b40833386d5be69c22db155b9c7f8a709"} Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.257855 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.257801542 podStartE2EDuration="2.257801542s" podCreationTimestamp="2025-10-01 15:16:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:16:45.229486552 +0000 UTC m=+1249.848661734" watchObservedRunningTime="2025-10-01 15:16:45.257801542 +0000 UTC m=+1249.876976723" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.267676 4771 scope.go:117] "RemoveContainer" containerID="6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606" Oct 01 15:16:45 crc kubenswrapper[4771]: E1001 15:16:45.277701 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606\": container with ID starting with 6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606 not found: ID does not exist" containerID="6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.277806 4771 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606"} err="failed to get container status \"6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606\": rpc error: code = NotFound desc = could not find container \"6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606\": container with ID starting with 6a6be13b2a991f9e86be260e28e6555487e4fbf87d6b2dcea6460227fd2a4606 not found: ID does not exist" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.291842 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.299754 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.307037 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:45 crc kubenswrapper[4771]: E1001 15:16:45.308844 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6c6725-6778-463a-89ab-d042864eca91" containerName="nova-scheduler-scheduler" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.308872 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6c6725-6778-463a-89ab-d042864eca91" containerName="nova-scheduler-scheduler" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.309044 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6c6725-6778-463a-89ab-d042864eca91" containerName="nova-scheduler-scheduler" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.309845 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.311859 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.317276 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.498523 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6dxr\" (UniqueName: \"kubernetes.io/projected/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-kube-api-access-t6dxr\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.498647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.498716 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-config-data\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.601020 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6dxr\" (UniqueName: \"kubernetes.io/projected/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-kube-api-access-t6dxr\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.601610 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.601840 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-config-data\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.611578 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-config-data\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.615803 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.624700 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6dxr\" (UniqueName: \"kubernetes.io/projected/f5a9190d-63c4-47e3-9fcd-ed0e0615d807-kube-api-access-t6dxr\") pod \"nova-scheduler-0\" (UID: \"f5a9190d-63c4-47e3-9fcd-ed0e0615d807\") " pod="openstack/nova-scheduler-0" Oct 01 15:16:45 crc kubenswrapper[4771]: I1001 15:16:45.635938 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.012513 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6c6725-6778-463a-89ab-d042864eca91" path="/var/lib/kubelet/pods/2d6c6725-6778-463a-89ab-d042864eca91/volumes" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.076542 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.213660 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-combined-ca-bundle\") pod \"4909086b-5a48-46b7-af75-e54bddc4014e\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.213758 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-internal-tls-certs\") pod \"4909086b-5a48-46b7-af75-e54bddc4014e\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.213852 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-config-data\") pod \"4909086b-5a48-46b7-af75-e54bddc4014e\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.213931 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nb4gk\" (UniqueName: \"kubernetes.io/projected/4909086b-5a48-46b7-af75-e54bddc4014e-kube-api-access-nb4gk\") pod \"4909086b-5a48-46b7-af75-e54bddc4014e\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.213979 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-public-tls-certs\") pod \"4909086b-5a48-46b7-af75-e54bddc4014e\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.214053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4909086b-5a48-46b7-af75-e54bddc4014e-logs\") pod \"4909086b-5a48-46b7-af75-e54bddc4014e\" (UID: \"4909086b-5a48-46b7-af75-e54bddc4014e\") " Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.214653 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4909086b-5a48-46b7-af75-e54bddc4014e-logs" (OuterVolumeSpecName: "logs") pod "4909086b-5a48-46b7-af75-e54bddc4014e" (UID: "4909086b-5a48-46b7-af75-e54bddc4014e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.217927 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4909086b-5a48-46b7-af75-e54bddc4014e-kube-api-access-nb4gk" (OuterVolumeSpecName: "kube-api-access-nb4gk") pod "4909086b-5a48-46b7-af75-e54bddc4014e" (UID: "4909086b-5a48-46b7-af75-e54bddc4014e"). InnerVolumeSpecName "kube-api-access-nb4gk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.221454 4771 generic.go:334] "Generic (PLEG): container finished" podID="4909086b-5a48-46b7-af75-e54bddc4014e" containerID="35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1" exitCode=0 Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.221511 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.221557 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4909086b-5a48-46b7-af75-e54bddc4014e","Type":"ContainerDied","Data":"35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1"} Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.221600 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4909086b-5a48-46b7-af75-e54bddc4014e","Type":"ContainerDied","Data":"1b12ecb27fa9880765a8df4a62ec21b0f92ed7c7043ee970caa275baac1d02fb"} Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.221623 4771 scope.go:117] "RemoveContainer" containerID="35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.244199 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-config-data" (OuterVolumeSpecName: "config-data") pod "4909086b-5a48-46b7-af75-e54bddc4014e" (UID: "4909086b-5a48-46b7-af75-e54bddc4014e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.266631 4771 scope.go:117] "RemoveContainer" containerID="9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.272812 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4909086b-5a48-46b7-af75-e54bddc4014e" (UID: "4909086b-5a48-46b7-af75-e54bddc4014e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.287911 4771 scope.go:117] "RemoveContainer" containerID="35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1" Oct 01 15:16:46 crc kubenswrapper[4771]: E1001 15:16:46.288323 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1\": container with ID starting with 35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1 not found: ID does not exist" containerID="35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.288365 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1"} err="failed to get container status \"35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1\": rpc error: code = NotFound desc = could not find container \"35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1\": container with ID starting with 35af8f22ec4841f2f5bcc95e24bf765a91c3a73ee9ff6421a43c92c642c456d1 not found: ID does not exist" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.288400 4771 scope.go:117] "RemoveContainer" containerID="9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1" Oct 01 15:16:46 crc kubenswrapper[4771]: E1001 15:16:46.288833 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1\": container with ID starting with 9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1 not found: ID does not exist" containerID="9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.288889 
4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1"} err="failed to get container status \"9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1\": rpc error: code = NotFound desc = could not find container \"9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1\": container with ID starting with 9c02131a2868b23ccc91f0912dd1b14039bb369303116298fff6d95ba6f327d1 not found: ID does not exist" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.294642 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4909086b-5a48-46b7-af75-e54bddc4014e" (UID: "4909086b-5a48-46b7-af75-e54bddc4014e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.316411 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.316443 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.316464 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nb4gk\" (UniqueName: \"kubernetes.io/projected/4909086b-5a48-46b7-af75-e54bddc4014e-kube-api-access-nb4gk\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.316493 4771 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.316510 4771 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4909086b-5a48-46b7-af75-e54bddc4014e-logs\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.317006 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4909086b-5a48-46b7-af75-e54bddc4014e" (UID: "4909086b-5a48-46b7-af75-e54bddc4014e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.418721 4771 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4909086b-5a48-46b7-af75-e54bddc4014e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.568602 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.581178 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.591400 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:46 crc kubenswrapper[4771]: E1001 15:16:46.591851 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-api" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.591875 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-api" Oct 01 15:16:46 crc kubenswrapper[4771]: E1001 15:16:46.591907 4771 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-log" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.591917 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-log" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.592138 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-api" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.592173 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" containerName="nova-api-log" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.593305 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.597970 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.598039 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.598340 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.616235 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.722828 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.723323 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-public-tls-certs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.723353 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.723465 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-config-data\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.723528 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbxhd\" (UniqueName: \"kubernetes.io/projected/f5ed9a0d-0d21-4432-aae5-dca422c5c331-kube-api-access-vbxhd\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.723591 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5ed9a0d-0d21-4432-aae5-dca422c5c331-logs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.776783 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 01 15:16:46 crc kubenswrapper[4771]: W1001 15:16:46.780842 4771 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a9190d_63c4_47e3_9fcd_ed0e0615d807.slice/crio-7042b9210ed84ed48cbecd868b0518732594dba68cef808059cfdc11fa4f33b6 WatchSource:0}: Error finding container 7042b9210ed84ed48cbecd868b0518732594dba68cef808059cfdc11fa4f33b6: Status 404 returned error can't find the container with id 7042b9210ed84ed48cbecd868b0518732594dba68cef808059cfdc11fa4f33b6 Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.825322 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-public-tls-certs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.825370 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.825438 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-config-data\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.825477 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbxhd\" (UniqueName: \"kubernetes.io/projected/f5ed9a0d-0d21-4432-aae5-dca422c5c331-kube-api-access-vbxhd\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.825505 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5ed9a0d-0d21-4432-aae5-dca422c5c331-logs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.825540 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.826115 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f5ed9a0d-0d21-4432-aae5-dca422c5c331-logs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.830199 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.830205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-config-data\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.831201 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-public-tls-certs\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: 
I1001 15:16:46.833132 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5ed9a0d-0d21-4432-aae5-dca422c5c331-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.844825 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbxhd\" (UniqueName: \"kubernetes.io/projected/f5ed9a0d-0d21-4432-aae5-dca422c5c331-kube-api-access-vbxhd\") pod \"nova-api-0\" (UID: \"f5ed9a0d-0d21-4432-aae5-dca422c5c331\") " pod="openstack/nova-api-0" Oct 01 15:16:46 crc kubenswrapper[4771]: I1001 15:16:46.961892 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 01 15:16:47 crc kubenswrapper[4771]: I1001 15:16:47.237633 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5a9190d-63c4-47e3-9fcd-ed0e0615d807","Type":"ContainerStarted","Data":"0618a8aa7e222a6e45d8e21d55583b922185ae3b7ee9fa64ce7c96d6d7e28629"} Oct 01 15:16:47 crc kubenswrapper[4771]: I1001 15:16:47.238011 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f5a9190d-63c4-47e3-9fcd-ed0e0615d807","Type":"ContainerStarted","Data":"7042b9210ed84ed48cbecd868b0518732594dba68cef808059cfdc11fa4f33b6"} Oct 01 15:16:47 crc kubenswrapper[4771]: I1001 15:16:47.451763 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.45166313 podStartE2EDuration="2.45166313s" podCreationTimestamp="2025-10-01 15:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:16:47.25539177 +0000 UTC m=+1251.874566951" watchObservedRunningTime="2025-10-01 15:16:47.45166313 +0000 UTC m=+1252.070838291" Oct 
01 15:16:47 crc kubenswrapper[4771]: I1001 15:16:47.457740 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 01 15:16:48 crc kubenswrapper[4771]: I1001 15:16:48.001291 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4909086b-5a48-46b7-af75-e54bddc4014e" path="/var/lib/kubelet/pods/4909086b-5a48-46b7-af75-e54bddc4014e/volumes" Oct 01 15:16:48 crc kubenswrapper[4771]: I1001 15:16:48.251441 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5ed9a0d-0d21-4432-aae5-dca422c5c331","Type":"ContainerStarted","Data":"179d38946de13a3e4f3d421d10250688e7885ddea26652ee72551da6a04ec5d6"} Oct 01 15:16:48 crc kubenswrapper[4771]: I1001 15:16:48.251933 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5ed9a0d-0d21-4432-aae5-dca422c5c331","Type":"ContainerStarted","Data":"31d5def552e1ed0a2e39fca78dae708be99c152cd79dec1e2423a7f5661c9822"} Oct 01 15:16:48 crc kubenswrapper[4771]: I1001 15:16:48.251960 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f5ed9a0d-0d21-4432-aae5-dca422c5c331","Type":"ContainerStarted","Data":"4bd0237d6eefb7f75451010a276073d2dfc41d090a0f8d8946b27fe58ea2c13b"} Oct 01 15:16:48 crc kubenswrapper[4771]: I1001 15:16:48.289586 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.28956067 podStartE2EDuration="2.28956067s" podCreationTimestamp="2025-10-01 15:16:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:16:48.287159885 +0000 UTC m=+1252.906335056" watchObservedRunningTime="2025-10-01 15:16:48.28956067 +0000 UTC m=+1252.908735871" Oct 01 15:16:48 crc kubenswrapper[4771]: I1001 15:16:48.550697 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 
01 15:16:48 crc kubenswrapper[4771]: I1001 15:16:48.550794 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 01 15:16:50 crc kubenswrapper[4771]: I1001 15:16:50.636710 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 01 15:16:53 crc kubenswrapper[4771]: I1001 15:16:53.550636 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 15:16:53 crc kubenswrapper[4771]: I1001 15:16:53.551187 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 01 15:16:54 crc kubenswrapper[4771]: I1001 15:16:54.569934 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10227ed2-4069-45bc-b3b9-091bb98d72af" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:54 crc kubenswrapper[4771]: I1001 15:16:54.569992 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="10227ed2-4069-45bc-b3b9-091bb98d72af" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.206:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:55 crc kubenswrapper[4771]: I1001 15:16:55.636508 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 01 15:16:55 crc kubenswrapper[4771]: I1001 15:16:55.688803 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 01 15:16:56 crc kubenswrapper[4771]: I1001 15:16:56.388676 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 01 15:16:56 crc kubenswrapper[4771]: I1001 15:16:56.962503 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 15:16:56 crc kubenswrapper[4771]: I1001 15:16:56.962568 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 01 15:16:57 crc kubenswrapper[4771]: I1001 15:16:57.977994 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f5ed9a0d-0d21-4432-aae5-dca422c5c331" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 01 15:16:57 crc kubenswrapper[4771]: I1001 15:16:57.978028 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f5ed9a0d-0d21-4432-aae5-dca422c5c331" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 15:17:01 crc kubenswrapper[4771]: I1001 15:17:01.430166 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 01 15:17:03 crc kubenswrapper[4771]: I1001 15:17:03.561533 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 15:17:03 crc kubenswrapper[4771]: I1001 15:17:03.563817 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 01 15:17:03 crc kubenswrapper[4771]: I1001 15:17:03.576165 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 15:17:04 crc kubenswrapper[4771]: I1001 15:17:04.437826 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 01 15:17:06 crc kubenswrapper[4771]: I1001 15:17:06.974478 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 
01 15:17:06 crc kubenswrapper[4771]: I1001 15:17:06.975723 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 15:17:06 crc kubenswrapper[4771]: I1001 15:17:06.983094 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 01 15:17:06 crc kubenswrapper[4771]: I1001 15:17:06.986293 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 15:17:07 crc kubenswrapper[4771]: I1001 15:17:07.461651 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 01 15:17:07 crc kubenswrapper[4771]: I1001 15:17:07.469064 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 01 15:17:12 crc kubenswrapper[4771]: I1001 15:17:12.177483 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:17:12 crc kubenswrapper[4771]: I1001 15:17:12.178367 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:17:15 crc kubenswrapper[4771]: I1001 15:17:15.623327 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:17:16 crc kubenswrapper[4771]: I1001 15:17:16.510050 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:17:19 crc kubenswrapper[4771]: I1001 15:17:19.640670 4771 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/rabbitmq-server-0" podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" containerName="rabbitmq" containerID="cri-o://880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3" gracePeriod=604796 Oct 01 15:17:20 crc kubenswrapper[4771]: I1001 15:17:20.346935 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" containerName="rabbitmq" containerID="cri-o://588dc45bf31c18016b37452ed3e4b68b57aad853692243431af12969d4a441e0" gracePeriod=604797 Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.214659 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381131 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-erlang-cookie\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381215 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-tls\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-config-data\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381299 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381419 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-server-conf\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381452 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ef68dad-0f62-4a2d-aa86-23997c284df0-pod-info\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381477 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-plugins\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381513 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ef68dad-0f62-4a2d-aa86-23997c284df0-erlang-cookie-secret\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381551 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj5v4\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-kube-api-access-vj5v4\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381659 4771 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381849 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.381637 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-confd\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.382312 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-plugins-conf\") pod \"4ef68dad-0f62-4a2d-aa86-23997c284df0\" (UID: \"4ef68dad-0f62-4a2d-aa86-23997c284df0\") " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.382712 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.382982 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.383004 4771 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.383018 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.388211 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef68dad-0f62-4a2d-aa86-23997c284df0-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.402065 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.402154 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4ef68dad-0f62-4a2d-aa86-23997c284df0-pod-info" (OuterVolumeSpecName: "pod-info") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.402572 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-kube-api-access-vj5v4" (OuterVolumeSpecName: "kube-api-access-vj5v4") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "kube-api-access-vj5v4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.404322 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.428963 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-config-data" (OuterVolumeSpecName: "config-data") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.446692 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-server-conf" (OuterVolumeSpecName: "server-conf") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.484511 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.484536 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.484563 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.484572 4771 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ef68dad-0f62-4a2d-aa86-23997c284df0-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.484580 4771 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ef68dad-0f62-4a2d-aa86-23997c284df0-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.484590 4771 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/4ef68dad-0f62-4a2d-aa86-23997c284df0-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.484600 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj5v4\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-kube-api-access-vj5v4\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.508471 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.518967 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4ef68dad-0f62-4a2d-aa86-23997c284df0" (UID: "4ef68dad-0f62-4a2d-aa86-23997c284df0"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.585892 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.585930 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ef68dad-0f62-4a2d-aa86-23997c284df0-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.700879 4771 generic.go:334] "Generic (PLEG): container finished" podID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" containerID="588dc45bf31c18016b37452ed3e4b68b57aad853692243431af12969d4a441e0" exitCode=0 Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.701014 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f11eb6e5-8306-4db5-af63-ef4d869f7e2c","Type":"ContainerDied","Data":"588dc45bf31c18016b37452ed3e4b68b57aad853692243431af12969d4a441e0"} Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.715080 4771 generic.go:334] "Generic (PLEG): container finished" podID="4ef68dad-0f62-4a2d-aa86-23997c284df0" containerID="880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3" exitCode=0 Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.715130 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ef68dad-0f62-4a2d-aa86-23997c284df0","Type":"ContainerDied","Data":"880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3"} Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.715165 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4ef68dad-0f62-4a2d-aa86-23997c284df0","Type":"ContainerDied","Data":"ba6d3137207476036e159eb1e79b492d458e6bd71f21bb58a90d1f0aafb6d4da"} Oct 01 
15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.715189 4771 scope.go:117] "RemoveContainer" containerID="880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.715253 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.735484 4771 scope.go:117] "RemoveContainer" containerID="6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.778611 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.786303 4771 scope.go:117] "RemoveContainer" containerID="880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3" Oct 01 15:17:26 crc kubenswrapper[4771]: E1001 15:17:26.790108 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3\": container with ID starting with 880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3 not found: ID does not exist" containerID="880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.790154 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3"} err="failed to get container status \"880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3\": rpc error: code = NotFound desc = could not find container \"880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3\": container with ID starting with 880f5a58079daaa348b357479c7cf97273062a832a8ce419c6f4d3ec57ffd1f3 not found: ID does not exist" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.790186 4771 
scope.go:117] "RemoveContainer" containerID="6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e" Oct 01 15:17:26 crc kubenswrapper[4771]: E1001 15:17:26.791362 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e\": container with ID starting with 6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e not found: ID does not exist" containerID="6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.791390 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e"} err="failed to get container status \"6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e\": rpc error: code = NotFound desc = could not find container \"6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e\": container with ID starting with 6d718463a0c87a6bd2faba1cb023cacfebbf31e9054a026b22782b0248bd589e not found: ID does not exist" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.793311 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.811560 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:17:26 crc kubenswrapper[4771]: E1001 15:17:26.812112 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" containerName="rabbitmq" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.812126 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" containerName="rabbitmq" Oct 01 15:17:26 crc kubenswrapper[4771]: E1001 15:17:26.812138 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" containerName="setup-container" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.812146 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" containerName="setup-container" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.812327 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" containerName="rabbitmq" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.813360 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.815278 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.815336 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.815877 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.816198 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.816373 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.816565 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-p24np" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.819068 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.825147 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:17:26 crc 
kubenswrapper[4771]: I1001 15:17:26.952228 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.992660 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.992707 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.992796 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.992817 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.992833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-plugins\") pod \"rabbitmq-server-0\" 
(UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.992887 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sdtc\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-kube-api-access-2sdtc\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.992924 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad32a9ec-a803-4d44-a4c1-03447e26e983-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.992951 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.993010 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.993045 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad32a9ec-a803-4d44-a4c1-03447e26e983-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " 
pod="openstack/rabbitmq-server-0" Oct 01 15:17:26 crc kubenswrapper[4771]: I1001 15:17:26.993131 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.094069 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-erlang-cookie\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.094232 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-plugins-conf\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.094267 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-plugins\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.094291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-config-data\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.094366 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-erlang-cookie-secret\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.094406 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-server-conf\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.094804 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.095002 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.095022 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.095185 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.095459 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-tls\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.095545 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-confd\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.095581 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-pod-info\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.095608 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcsk9\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-kube-api-access-tcsk9\") pod \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\" (UID: \"f11eb6e5-8306-4db5-af63-ef4d869f7e2c\") " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.095962 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/ad32a9ec-a803-4d44-a4c1-03447e26e983-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096008 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096072 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096099 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096152 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096171 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " 
pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096206 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096250 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sdtc\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-kube-api-access-2sdtc\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096320 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad32a9ec-a803-4d44-a4c1-03447e26e983-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096354 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096464 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096516 4771 reconciler_common.go:293] "Volume detached for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096527 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.096537 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.098727 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "persistence") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.100287 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-server-conf\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.100727 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.100960 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.101240 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.103535 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-pod-info" (OuterVolumeSpecName: "pod-info") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.103657 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.100961 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.105268 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ad32a9ec-a803-4d44-a4c1-03447e26e983-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.106115 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ad32a9ec-a803-4d44-a4c1-03447e26e983-config-data\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.107521 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.107622 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-kube-api-access-tcsk9" (OuterVolumeSpecName: "kube-api-access-tcsk9") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "kube-api-access-tcsk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.109243 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ad32a9ec-a803-4d44-a4c1-03447e26e983-pod-info\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.109486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.113864 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.124070 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sdtc\" (UniqueName: \"kubernetes.io/projected/ad32a9ec-a803-4d44-a4c1-03447e26e983-kube-api-access-2sdtc\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.139242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-config-data" (OuterVolumeSpecName: "config-data") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.148691 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"ad32a9ec-a803-4d44-a4c1-03447e26e983\") " pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.153897 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-server-conf" (OuterVolumeSpecName: "server-conf") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.198098 4771 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.198474 4771 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-server-conf\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.198509 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.198526 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.198537 4771 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-pod-info\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.198550 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcsk9\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-kube-api-access-tcsk9\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.198562 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.217498 4771 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "f11eb6e5-8306-4db5-af63-ef4d869f7e2c" (UID: "f11eb6e5-8306-4db5-af63-ef4d869f7e2c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.232034 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.300978 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.301032 4771 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f11eb6e5-8306-4db5-af63-ef4d869f7e2c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.450175 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.722388 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.728157 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"f11eb6e5-8306-4db5-af63-ef4d869f7e2c","Type":"ContainerDied","Data":"990a821ffedc702f5b50a8e7d9be36c92d0f3e39dacf89d89368fbcb6026ccf2"} Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.728241 4771 scope.go:117] "RemoveContainer" containerID="588dc45bf31c18016b37452ed3e4b68b57aad853692243431af12969d4a441e0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.728247 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.778425 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.794132 4771 scope.go:117] "RemoveContainer" containerID="c3eb3e3832dbb71c6552b52fffc9d46d565d2ed58dc9ada50391280b5d29480a" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.802837 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.832970 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:17:27 crc kubenswrapper[4771]: E1001 15:17:27.835849 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" containerName="rabbitmq" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.835872 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" containerName="rabbitmq" Oct 01 15:17:27 crc kubenswrapper[4771]: E1001 15:17:27.835987 4771 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" containerName="setup-container" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.835997 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" containerName="setup-container" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.852434 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" containerName="rabbitmq" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.860686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.860816 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.862519 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.862670 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.862898 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.863126 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.863227 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7mknv" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.863257 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.863286 4771 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.994683 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef68dad-0f62-4a2d-aa86-23997c284df0" path="/var/lib/kubelet/pods/4ef68dad-0f62-4a2d-aa86-23997c284df0/volumes" Oct 01 15:17:27 crc kubenswrapper[4771]: I1001 15:17:27.995490 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f11eb6e5-8306-4db5-af63-ef4d869f7e2c" path="/var/lib/kubelet/pods/f11eb6e5-8306-4db5-af63-ef4d869f7e2c/volumes" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042562 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042600 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4shx6\" (UniqueName: \"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-kube-api-access-4shx6\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042630 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042656 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/15eb6248-64ff-4f3d-bcb4-4d78026673d4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042670 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15eb6248-64ff-4f3d-bcb4-4d78026673d4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042747 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042771 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042808 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042827 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042850 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.042884 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.143912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.143961 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144019 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144038 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144065 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144490 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144567 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144594 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " 
pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144612 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4shx6\" (UniqueName: \"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-kube-api-access-4shx6\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144639 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144665 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15eb6248-64ff-4f3d-bcb4-4d78026673d4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144682 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15eb6248-64ff-4f3d-bcb4-4d78026673d4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.144903 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.145308 
4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.145514 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.146118 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.147334 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/15eb6248-64ff-4f3d-bcb4-4d78026673d4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.148377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/15eb6248-64ff-4f3d-bcb4-4d78026673d4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.150035 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.150041 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.153197 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/15eb6248-64ff-4f3d-bcb4-4d78026673d4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.164981 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4shx6\" (UniqueName: \"kubernetes.io/projected/15eb6248-64ff-4f3d-bcb4-4d78026673d4-kube-api-access-4shx6\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.180370 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"15eb6248-64ff-4f3d-bcb4-4d78026673d4\") " pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.236577 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.637240 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kz9fk"] Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.638705 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.640798 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.708323 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.719981 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kz9fk"] Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.750712 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad32a9ec-a803-4d44-a4c1-03447e26e983","Type":"ContainerStarted","Data":"e85bbf5df2e7c2a2d0e22a679979ff120dfa478e571483010580ba11f7852c50"} Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.759720 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.759817 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " 
pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.759838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.759864 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x92k\" (UniqueName: \"kubernetes.io/projected/ba463217-6da6-4b95-9c2c-15f24ec34d0e-kube-api-access-6x92k\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.759888 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-svc\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.759922 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.759945 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-config\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: 
\"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.785001 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15eb6248-64ff-4f3d-bcb4-4d78026673d4","Type":"ContainerStarted","Data":"329f82aa733ca40071a0a45e933dd580a7f9fcf48a91344c97a9cc6188a0ce2b"} Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.864916 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x92k\" (UniqueName: \"kubernetes.io/projected/ba463217-6da6-4b95-9c2c-15f24ec34d0e-kube-api-access-6x92k\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.865374 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-svc\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.865449 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.865487 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-config\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 
15:17:28.865577 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.865658 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.865680 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.866559 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-openstack-edpm-ipam\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.866629 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-sb\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.867249 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-config\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.867353 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-svc\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.867493 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-swift-storage-0\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.867813 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-nb\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.897439 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x92k\" (UniqueName: \"kubernetes.io/projected/ba463217-6da6-4b95-9c2c-15f24ec34d0e-kube-api-access-6x92k\") pod \"dnsmasq-dns-67b789f86c-kz9fk\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:28 crc kubenswrapper[4771]: I1001 15:17:28.959900 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:29 crc kubenswrapper[4771]: I1001 15:17:29.548616 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kz9fk"] Oct 01 15:17:29 crc kubenswrapper[4771]: I1001 15:17:29.797822 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad32a9ec-a803-4d44-a4c1-03447e26e983","Type":"ContainerStarted","Data":"1b2344734abadfa4b9aacd41d393b3ed16e330ce5df5f1bef5443146b58ae1b5"} Oct 01 15:17:29 crc kubenswrapper[4771]: I1001 15:17:29.800265 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" event={"ID":"ba463217-6da6-4b95-9c2c-15f24ec34d0e","Type":"ContainerStarted","Data":"5cf73ec58cc1123af22b25b02109acfdc7aed3f7122abf6f1fcebe651196e936"} Oct 01 15:17:30 crc kubenswrapper[4771]: I1001 15:17:30.814072 4771 generic.go:334] "Generic (PLEG): container finished" podID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" containerID="56eef2c37071396c9650602f25de34e4b5dc4e39b37639652b357006f0c0f99c" exitCode=0 Oct 01 15:17:30 crc kubenswrapper[4771]: I1001 15:17:30.814172 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" event={"ID":"ba463217-6da6-4b95-9c2c-15f24ec34d0e","Type":"ContainerDied","Data":"56eef2c37071396c9650602f25de34e4b5dc4e39b37639652b357006f0c0f99c"} Oct 01 15:17:30 crc kubenswrapper[4771]: I1001 15:17:30.816935 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15eb6248-64ff-4f3d-bcb4-4d78026673d4","Type":"ContainerStarted","Data":"5f3490a57e0088143080649335d20bb8ba3c62353ec3bd546b5e759fe24e4c9a"} Oct 01 15:17:31 crc kubenswrapper[4771]: I1001 15:17:31.829263 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" 
event={"ID":"ba463217-6da6-4b95-9c2c-15f24ec34d0e","Type":"ContainerStarted","Data":"6db9647877c86fa0e538911431e4e7b925059f199c33a36783f09ee88e010bca"} Oct 01 15:17:31 crc kubenswrapper[4771]: I1001 15:17:31.863545 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" podStartSLOduration=3.863455444 podStartE2EDuration="3.863455444s" podCreationTimestamp="2025-10-01 15:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:17:31.856452844 +0000 UTC m=+1296.475628025" watchObservedRunningTime="2025-10-01 15:17:31.863455444 +0000 UTC m=+1296.482630655" Oct 01 15:17:32 crc kubenswrapper[4771]: I1001 15:17:32.845821 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:38 crc kubenswrapper[4771]: I1001 15:17:38.962405 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.038925 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-h7bvh"] Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.039215 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" podUID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" containerName="dnsmasq-dns" containerID="cri-o://d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e" gracePeriod=10 Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.237612 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-952sj"] Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.240997 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.263686 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-952sj"] Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.288839 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-config\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.291480 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.291638 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhpcw\" (UniqueName: \"kubernetes.io/projected/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-kube-api-access-xhpcw\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.291673 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.291698 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.291831 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.291883 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.393437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.393919 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-config\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.393943 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-ovsdbserver-sb\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.394000 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhpcw\" (UniqueName: \"kubernetes.io/projected/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-kube-api-access-xhpcw\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.394020 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.394038 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.394099 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.394256 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-ovsdbserver-nb\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.395067 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-config\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.395636 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-openstack-edpm-ipam\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.395952 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-dns-svc\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.396993 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-dns-swift-storage-0\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.396055 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-ovsdbserver-sb\") 
pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.414589 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhpcw\" (UniqueName: \"kubernetes.io/projected/e5a19de7-6ecc-4f22-bc0c-18f3761eef3c-kube-api-access-xhpcw\") pod \"dnsmasq-dns-cb6ffcf87-952sj\" (UID: \"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c\") " pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.569821 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.578329 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.704308 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-svc\") pod \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.704819 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6ngs\" (UniqueName: \"kubernetes.io/projected/368f9218-00d0-4a40-a746-6ba4e5a67d1d-kube-api-access-d6ngs\") pod \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.704849 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-config\") pod \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 
15:17:39.705004 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-swift-storage-0\") pod \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.705058 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-sb\") pod \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.705095 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-nb\") pod \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\" (UID: \"368f9218-00d0-4a40-a746-6ba4e5a67d1d\") " Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.707796 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/368f9218-00d0-4a40-a746-6ba4e5a67d1d-kube-api-access-d6ngs" (OuterVolumeSpecName: "kube-api-access-d6ngs") pod "368f9218-00d0-4a40-a746-6ba4e5a67d1d" (UID: "368f9218-00d0-4a40-a746-6ba4e5a67d1d"). InnerVolumeSpecName "kube-api-access-d6ngs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.753960 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "368f9218-00d0-4a40-a746-6ba4e5a67d1d" (UID: "368f9218-00d0-4a40-a746-6ba4e5a67d1d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.779897 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-config" (OuterVolumeSpecName: "config") pod "368f9218-00d0-4a40-a746-6ba4e5a67d1d" (UID: "368f9218-00d0-4a40-a746-6ba4e5a67d1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.781311 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "368f9218-00d0-4a40-a746-6ba4e5a67d1d" (UID: "368f9218-00d0-4a40-a746-6ba4e5a67d1d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.782395 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "368f9218-00d0-4a40-a746-6ba4e5a67d1d" (UID: "368f9218-00d0-4a40-a746-6ba4e5a67d1d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.801298 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "368f9218-00d0-4a40-a746-6ba4e5a67d1d" (UID: "368f9218-00d0-4a40-a746-6ba4e5a67d1d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.807824 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.807880 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.807898 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.807917 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.807932 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6ngs\" (UniqueName: \"kubernetes.io/projected/368f9218-00d0-4a40-a746-6ba4e5a67d1d-kube-api-access-d6ngs\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.807949 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/368f9218-00d0-4a40-a746-6ba4e5a67d1d-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.931981 4771 generic.go:334] "Generic (PLEG): container finished" podID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" containerID="d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e" exitCode=0 Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.932027 4771 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" event={"ID":"368f9218-00d0-4a40-a746-6ba4e5a67d1d","Type":"ContainerDied","Data":"d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e"} Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.932050 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.932070 4771 scope.go:117] "RemoveContainer" containerID="d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.932058 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59cf4bdb65-h7bvh" event={"ID":"368f9218-00d0-4a40-a746-6ba4e5a67d1d","Type":"ContainerDied","Data":"103122cd0a58b9480a1dffa15a11694133f3affc921745a5e14c3e537463e291"} Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.957592 4771 scope.go:117] "RemoveContainer" containerID="c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.987839 4771 scope.go:117] "RemoveContainer" containerID="d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e" Oct 01 15:17:39 crc kubenswrapper[4771]: E1001 15:17:39.988565 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e\": container with ID starting with d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e not found: ID does not exist" containerID="d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.988592 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e"} err="failed to get container status 
\"d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e\": rpc error: code = NotFound desc = could not find container \"d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e\": container with ID starting with d1446140f30b83ae0a9125c0a87210fe2e569343df251b2a5a27eb7e33a7da6e not found: ID does not exist" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.988612 4771 scope.go:117] "RemoveContainer" containerID="c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4" Oct 01 15:17:39 crc kubenswrapper[4771]: E1001 15:17:39.988972 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4\": container with ID starting with c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4 not found: ID does not exist" containerID="c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.988995 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4"} err="failed to get container status \"c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4\": rpc error: code = NotFound desc = could not find container \"c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4\": container with ID starting with c944dbe155ae0fe15aab08aa5ba510e55107af409a3a3b5f7541c3bf81ec74e4 not found: ID does not exist" Oct 01 15:17:39 crc kubenswrapper[4771]: I1001 15:17:39.997947 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-h7bvh"] Oct 01 15:17:40 crc kubenswrapper[4771]: I1001 15:17:40.009717 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59cf4bdb65-h7bvh"] Oct 01 15:17:40 crc kubenswrapper[4771]: I1001 15:17:40.016312 4771 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-cb6ffcf87-952sj"] Oct 01 15:17:40 crc kubenswrapper[4771]: W1001 15:17:40.018316 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5a19de7_6ecc_4f22_bc0c_18f3761eef3c.slice/crio-90ad71596b15bfa43e28538ba33b6c35545eb424160222c8a259a2ee3f68d093 WatchSource:0}: Error finding container 90ad71596b15bfa43e28538ba33b6c35545eb424160222c8a259a2ee3f68d093: Status 404 returned error can't find the container with id 90ad71596b15bfa43e28538ba33b6c35545eb424160222c8a259a2ee3f68d093 Oct 01 15:17:40 crc kubenswrapper[4771]: I1001 15:17:40.951381 4771 generic.go:334] "Generic (PLEG): container finished" podID="e5a19de7-6ecc-4f22-bc0c-18f3761eef3c" containerID="7f63c98389408e050e925010b0eda705de70d0f119ca08d502b5089acc8db484" exitCode=0 Oct 01 15:17:40 crc kubenswrapper[4771]: I1001 15:17:40.951465 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" event={"ID":"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c","Type":"ContainerDied","Data":"7f63c98389408e050e925010b0eda705de70d0f119ca08d502b5089acc8db484"} Oct 01 15:17:40 crc kubenswrapper[4771]: I1001 15:17:40.951691 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" event={"ID":"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c","Type":"ContainerStarted","Data":"90ad71596b15bfa43e28538ba33b6c35545eb424160222c8a259a2ee3f68d093"} Oct 01 15:17:41 crc kubenswrapper[4771]: I1001 15:17:41.982599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" event={"ID":"e5a19de7-6ecc-4f22-bc0c-18f3761eef3c","Type":"ContainerStarted","Data":"57b7b9b1d401cfbe5578d5e003f3dfdb4a7f699d5885aad75dbc08f9bd73d7df"} Oct 01 15:17:41 crc kubenswrapper[4771]: I1001 15:17:41.983168 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:42 crc 
kubenswrapper[4771]: I1001 15:17:42.003797 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" path="/var/lib/kubelet/pods/368f9218-00d0-4a40-a746-6ba4e5a67d1d/volumes" Oct 01 15:17:42 crc kubenswrapper[4771]: I1001 15:17:42.017120 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" podStartSLOduration=3.017091246 podStartE2EDuration="3.017091246s" podCreationTimestamp="2025-10-01 15:17:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:17:42.01549513 +0000 UTC m=+1306.634670311" watchObservedRunningTime="2025-10-01 15:17:42.017091246 +0000 UTC m=+1306.636266447" Oct 01 15:17:42 crc kubenswrapper[4771]: I1001 15:17:42.178072 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:17:42 crc kubenswrapper[4771]: I1001 15:17:42.178140 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:17:49 crc kubenswrapper[4771]: I1001 15:17:49.571167 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cb6ffcf87-952sj" Oct 01 15:17:49 crc kubenswrapper[4771]: I1001 15:17:49.672901 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kz9fk"] Oct 01 15:17:49 crc kubenswrapper[4771]: I1001 15:17:49.674072 4771 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" podUID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" containerName="dnsmasq-dns" containerID="cri-o://6db9647877c86fa0e538911431e4e7b925059f199c33a36783f09ee88e010bca" gracePeriod=10 Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.063223 4771 generic.go:334] "Generic (PLEG): container finished" podID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" containerID="6db9647877c86fa0e538911431e4e7b925059f199c33a36783f09ee88e010bca" exitCode=0 Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.063268 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" event={"ID":"ba463217-6da6-4b95-9c2c-15f24ec34d0e","Type":"ContainerDied","Data":"6db9647877c86fa0e538911431e4e7b925059f199c33a36783f09ee88e010bca"} Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.063297 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" event={"ID":"ba463217-6da6-4b95-9c2c-15f24ec34d0e","Type":"ContainerDied","Data":"5cf73ec58cc1123af22b25b02109acfdc7aed3f7122abf6f1fcebe651196e936"} Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.063316 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5cf73ec58cc1123af22b25b02109acfdc7aed3f7122abf6f1fcebe651196e936" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.124649 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.280473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-swift-storage-0\") pod \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.280555 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-nb\") pod \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.280601 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-sb\") pod \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.280624 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-svc\") pod \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.280650 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-openstack-edpm-ipam\") pod \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.280725 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-config\") pod \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.280815 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6x92k\" (UniqueName: \"kubernetes.io/projected/ba463217-6da6-4b95-9c2c-15f24ec34d0e-kube-api-access-6x92k\") pod \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\" (UID: \"ba463217-6da6-4b95-9c2c-15f24ec34d0e\") " Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.288008 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba463217-6da6-4b95-9c2c-15f24ec34d0e-kube-api-access-6x92k" (OuterVolumeSpecName: "kube-api-access-6x92k") pod "ba463217-6da6-4b95-9c2c-15f24ec34d0e" (UID: "ba463217-6da6-4b95-9c2c-15f24ec34d0e"). InnerVolumeSpecName "kube-api-access-6x92k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.328323 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba463217-6da6-4b95-9c2c-15f24ec34d0e" (UID: "ba463217-6da6-4b95-9c2c-15f24ec34d0e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.345226 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ba463217-6da6-4b95-9c2c-15f24ec34d0e" (UID: "ba463217-6da6-4b95-9c2c-15f24ec34d0e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.345418 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba463217-6da6-4b95-9c2c-15f24ec34d0e" (UID: "ba463217-6da6-4b95-9c2c-15f24ec34d0e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.347412 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ba463217-6da6-4b95-9c2c-15f24ec34d0e" (UID: "ba463217-6da6-4b95-9c2c-15f24ec34d0e"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.360877 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-config" (OuterVolumeSpecName: "config") pod "ba463217-6da6-4b95-9c2c-15f24ec34d0e" (UID: "ba463217-6da6-4b95-9c2c-15f24ec34d0e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.367009 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba463217-6da6-4b95-9c2c-15f24ec34d0e" (UID: "ba463217-6da6-4b95-9c2c-15f24ec34d0e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.383564 4771 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.383601 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.383613 4771 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.383625 4771 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.383639 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.385028 4771 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba463217-6da6-4b95-9c2c-15f24ec34d0e-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:50 crc kubenswrapper[4771]: I1001 15:17:50.385060 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6x92k\" (UniqueName: \"kubernetes.io/projected/ba463217-6da6-4b95-9c2c-15f24ec34d0e-kube-api-access-6x92k\") on node \"crc\" DevicePath \"\"" Oct 01 15:17:51 crc kubenswrapper[4771]: I1001 15:17:51.072719 
4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b789f86c-kz9fk" Oct 01 15:17:51 crc kubenswrapper[4771]: I1001 15:17:51.111413 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kz9fk"] Oct 01 15:17:51 crc kubenswrapper[4771]: I1001 15:17:51.118803 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b789f86c-kz9fk"] Oct 01 15:17:52 crc kubenswrapper[4771]: I1001 15:17:52.001572 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" path="/var/lib/kubelet/pods/ba463217-6da6-4b95-9c2c-15f24ec34d0e/volumes" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.207449 4771 generic.go:334] "Generic (PLEG): container finished" podID="ad32a9ec-a803-4d44-a4c1-03447e26e983" containerID="1b2344734abadfa4b9aacd41d393b3ed16e330ce5df5f1bef5443146b58ae1b5" exitCode=0 Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.207568 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad32a9ec-a803-4d44-a4c1-03447e26e983","Type":"ContainerDied","Data":"1b2344734abadfa4b9aacd41d393b3ed16e330ce5df5f1bef5443146b58ae1b5"} Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.858579 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs"] Oct 01 15:18:02 crc kubenswrapper[4771]: E1001 15:18:02.860335 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" containerName="dnsmasq-dns" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.860362 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" containerName="dnsmasq-dns" Oct 01 15:18:02 crc kubenswrapper[4771]: E1001 15:18:02.860378 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" containerName="init" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.860387 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" containerName="init" Oct 01 15:18:02 crc kubenswrapper[4771]: E1001 15:18:02.860402 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" containerName="dnsmasq-dns" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.860412 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" containerName="dnsmasq-dns" Oct 01 15:18:02 crc kubenswrapper[4771]: E1001 15:18:02.860430 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" containerName="init" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.860439 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" containerName="init" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.860695 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba463217-6da6-4b95-9c2c-15f24ec34d0e" containerName="dnsmasq-dns" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.860715 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="368f9218-00d0-4a40-a746-6ba4e5a67d1d" containerName="dnsmasq-dns" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.861587 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.864563 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.864568 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.864701 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.864706 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.900768 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs"] Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.978916 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.979287 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.979551 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7bs8\" (UniqueName: \"kubernetes.io/projected/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-kube-api-access-t7bs8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:02 crc kubenswrapper[4771]: I1001 15:18:02.979596 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.082303 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.082449 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7bs8\" (UniqueName: \"kubernetes.io/projected/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-kube-api-access-t7bs8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.082489 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-ssh-key\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.082582 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.088781 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.090468 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.090543 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.108076 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-t7bs8\" (UniqueName: \"kubernetes.io/projected/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-kube-api-access-t7bs8\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.183021 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.230232 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"ad32a9ec-a803-4d44-a4c1-03447e26e983","Type":"ContainerStarted","Data":"e51fa16e121c338bc9992c0956fb2d8877119055f9a510dc3cbb3c4ff0f57b0b"} Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.230784 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.259936 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.259915435 podStartE2EDuration="37.259915435s" podCreationTimestamp="2025-10-01 15:17:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:18:03.259625598 +0000 UTC m=+1327.878800809" watchObservedRunningTime="2025-10-01 15:18:03.259915435 +0000 UTC m=+1327.879090606" Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.761537 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs"] Oct 01 15:18:03 crc kubenswrapper[4771]: I1001 15:18:03.769815 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:18:04 crc kubenswrapper[4771]: I1001 15:18:04.243955 4771 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" event={"ID":"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1","Type":"ContainerStarted","Data":"faea4ca594374afc3a8507e43e7e0e78757b8182cc359b927e87532d5ef134c8"} Oct 01 15:18:04 crc kubenswrapper[4771]: I1001 15:18:04.246493 4771 generic.go:334] "Generic (PLEG): container finished" podID="15eb6248-64ff-4f3d-bcb4-4d78026673d4" containerID="5f3490a57e0088143080649335d20bb8ba3c62353ec3bd546b5e759fe24e4c9a" exitCode=0 Oct 01 15:18:04 crc kubenswrapper[4771]: I1001 15:18:04.246566 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15eb6248-64ff-4f3d-bcb4-4d78026673d4","Type":"ContainerDied","Data":"5f3490a57e0088143080649335d20bb8ba3c62353ec3bd546b5e759fe24e4c9a"} Oct 01 15:18:05 crc kubenswrapper[4771]: I1001 15:18:05.257457 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"15eb6248-64ff-4f3d-bcb4-4d78026673d4","Type":"ContainerStarted","Data":"2997fd9d2575eff41488be81a808d01d744f388ccc3382bfc4103d2d675af17e"} Oct 01 15:18:05 crc kubenswrapper[4771]: I1001 15:18:05.257695 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:18:05 crc kubenswrapper[4771]: I1001 15:18:05.287235 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.287209195 podStartE2EDuration="38.287209195s" podCreationTimestamp="2025-10-01 15:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:18:05.280610573 +0000 UTC m=+1329.899785734" watchObservedRunningTime="2025-10-01 15:18:05.287209195 +0000 UTC m=+1329.906384366" Oct 01 15:18:12 crc kubenswrapper[4771]: I1001 15:18:12.177788 4771 patch_prober.go:28] interesting 
pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:18:12 crc kubenswrapper[4771]: I1001 15:18:12.178371 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:18:12 crc kubenswrapper[4771]: I1001 15:18:12.178468 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:18:12 crc kubenswrapper[4771]: I1001 15:18:12.179310 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fb5c90115d7e5e2b881eca81e834dd62c83b4059c941e9b801dafa27eac271e"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:18:12 crc kubenswrapper[4771]: I1001 15:18:12.179368 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://6fb5c90115d7e5e2b881eca81e834dd62c83b4059c941e9b801dafa27eac271e" gracePeriod=600 Oct 01 15:18:12 crc kubenswrapper[4771]: I1001 15:18:12.339070 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="6fb5c90115d7e5e2b881eca81e834dd62c83b4059c941e9b801dafa27eac271e" exitCode=0 Oct 01 15:18:12 crc kubenswrapper[4771]: I1001 15:18:12.339157 
4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"6fb5c90115d7e5e2b881eca81e834dd62c83b4059c941e9b801dafa27eac271e"} Oct 01 15:18:12 crc kubenswrapper[4771]: I1001 15:18:12.339455 4771 scope.go:117] "RemoveContainer" containerID="a4ed1f9c5d09bf489c874da2478bf24ba55dbfc7c07deabe55036c8bafeb8e52" Oct 01 15:18:13 crc kubenswrapper[4771]: I1001 15:18:13.354012 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2"} Oct 01 15:18:13 crc kubenswrapper[4771]: I1001 15:18:13.356284 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" event={"ID":"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1","Type":"ContainerStarted","Data":"822d88fb5fd84e7a8f7d57a621c98dc252af8b4e6e2be3a7e0ec426ce983d698"} Oct 01 15:18:13 crc kubenswrapper[4771]: I1001 15:18:13.398915 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" podStartSLOduration=2.760186712 podStartE2EDuration="11.398893122s" podCreationTimestamp="2025-10-01 15:18:02 +0000 UTC" firstStartedPulling="2025-10-01 15:18:03.76957316 +0000 UTC m=+1328.388748351" lastFinishedPulling="2025-10-01 15:18:12.40827958 +0000 UTC m=+1337.027454761" observedRunningTime="2025-10-01 15:18:13.392659568 +0000 UTC m=+1338.011834779" watchObservedRunningTime="2025-10-01 15:18:13.398893122 +0000 UTC m=+1338.018068293" Oct 01 15:18:17 crc kubenswrapper[4771]: I1001 15:18:17.453945 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 01 15:18:18 crc kubenswrapper[4771]: I1001 
15:18:18.241054 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 01 15:18:28 crc kubenswrapper[4771]: I1001 15:18:28.553552 4771 generic.go:334] "Generic (PLEG): container finished" podID="a91e8801-f8f9-4ce4-ba42-a4fa54057ec1" containerID="822d88fb5fd84e7a8f7d57a621c98dc252af8b4e6e2be3a7e0ec426ce983d698" exitCode=0 Oct 01 15:18:28 crc kubenswrapper[4771]: I1001 15:18:28.553898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" event={"ID":"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1","Type":"ContainerDied","Data":"822d88fb5fd84e7a8f7d57a621c98dc252af8b4e6e2be3a7e0ec426ce983d698"} Oct 01 15:18:29 crc kubenswrapper[4771]: I1001 15:18:29.977240 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.087961 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-inventory\") pod \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.088138 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-repo-setup-combined-ca-bundle\") pod \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.088185 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-ssh-key\") pod \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " Oct 
01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.088271 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7bs8\" (UniqueName: \"kubernetes.io/projected/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-kube-api-access-t7bs8\") pod \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\" (UID: \"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1\") " Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.103050 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a91e8801-f8f9-4ce4-ba42-a4fa54057ec1" (UID: "a91e8801-f8f9-4ce4-ba42-a4fa54057ec1"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.103165 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-kube-api-access-t7bs8" (OuterVolumeSpecName: "kube-api-access-t7bs8") pod "a91e8801-f8f9-4ce4-ba42-a4fa54057ec1" (UID: "a91e8801-f8f9-4ce4-ba42-a4fa54057ec1"). InnerVolumeSpecName "kube-api-access-t7bs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.116250 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-inventory" (OuterVolumeSpecName: "inventory") pod "a91e8801-f8f9-4ce4-ba42-a4fa54057ec1" (UID: "a91e8801-f8f9-4ce4-ba42-a4fa54057ec1"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.116858 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a91e8801-f8f9-4ce4-ba42-a4fa54057ec1" (UID: "a91e8801-f8f9-4ce4-ba42-a4fa54057ec1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.189936 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.190111 4771 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.190169 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.190243 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7bs8\" (UniqueName: \"kubernetes.io/projected/a91e8801-f8f9-4ce4-ba42-a4fa54057ec1-kube-api-access-t7bs8\") on node \"crc\" DevicePath \"\"" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.605970 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" event={"ID":"a91e8801-f8f9-4ce4-ba42-a4fa54057ec1","Type":"ContainerDied","Data":"faea4ca594374afc3a8507e43e7e0e78757b8182cc359b927e87532d5ef134c8"} Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.606026 4771 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="faea4ca594374afc3a8507e43e7e0e78757b8182cc359b927e87532d5ef134c8" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.606245 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.696063 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh"] Oct 01 15:18:30 crc kubenswrapper[4771]: E1001 15:18:30.697091 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a91e8801-f8f9-4ce4-ba42-a4fa54057ec1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.697120 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a91e8801-f8f9-4ce4-ba42-a4fa54057ec1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.697501 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a91e8801-f8f9-4ce4-ba42-a4fa54057ec1" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.698573 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.701813 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.701981 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.702030 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.702236 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.711496 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh"] Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.801538 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.801699 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sssh\" (UniqueName: \"kubernetes.io/projected/856ab139-589a-4b24-89ab-37ef20ef1762-kube-api-access-6sssh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.801771 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.903182 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sssh\" (UniqueName: \"kubernetes.io/projected/856ab139-589a-4b24-89ab-37ef20ef1762-kube-api-access-6sssh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.903248 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.903325 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.912663 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.912765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:30 crc kubenswrapper[4771]: I1001 15:18:30.928500 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sssh\" (UniqueName: \"kubernetes.io/projected/856ab139-589a-4b24-89ab-37ef20ef1762-kube-api-access-6sssh\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-l4srh\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:31 crc kubenswrapper[4771]: I1001 15:18:31.030029 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:31 crc kubenswrapper[4771]: I1001 15:18:31.608276 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh"] Oct 01 15:18:31 crc kubenswrapper[4771]: W1001 15:18:31.613061 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod856ab139_589a_4b24_89ab_37ef20ef1762.slice/crio-27c6691c17b571ce2f12f40e05f7d8aa038ffa6f948b17f624d0ca2ecebeae98 WatchSource:0}: Error finding container 27c6691c17b571ce2f12f40e05f7d8aa038ffa6f948b17f624d0ca2ecebeae98: Status 404 returned error can't find the container with id 27c6691c17b571ce2f12f40e05f7d8aa038ffa6f948b17f624d0ca2ecebeae98 Oct 01 15:18:32 crc kubenswrapper[4771]: I1001 15:18:32.626896 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" event={"ID":"856ab139-589a-4b24-89ab-37ef20ef1762","Type":"ContainerStarted","Data":"2e63bfc0732afd4612f504810e7c7bfe58aa51859337f55ab25c63e8646af12e"} Oct 01 15:18:32 crc kubenswrapper[4771]: I1001 15:18:32.627246 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" event={"ID":"856ab139-589a-4b24-89ab-37ef20ef1762","Type":"ContainerStarted","Data":"27c6691c17b571ce2f12f40e05f7d8aa038ffa6f948b17f624d0ca2ecebeae98"} Oct 01 15:18:32 crc kubenswrapper[4771]: I1001 15:18:32.649719 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" podStartSLOduration=2.230677169 podStartE2EDuration="2.649700536s" podCreationTimestamp="2025-10-01 15:18:30 +0000 UTC" firstStartedPulling="2025-10-01 15:18:31.616471987 +0000 UTC m=+1356.235647158" lastFinishedPulling="2025-10-01 15:18:32.035495344 +0000 UTC m=+1356.654670525" observedRunningTime="2025-10-01 
15:18:32.644103888 +0000 UTC m=+1357.263279079" watchObservedRunningTime="2025-10-01 15:18:32.649700536 +0000 UTC m=+1357.268875707" Oct 01 15:18:35 crc kubenswrapper[4771]: I1001 15:18:35.654879 4771 generic.go:334] "Generic (PLEG): container finished" podID="856ab139-589a-4b24-89ab-37ef20ef1762" containerID="2e63bfc0732afd4612f504810e7c7bfe58aa51859337f55ab25c63e8646af12e" exitCode=0 Oct 01 15:18:35 crc kubenswrapper[4771]: I1001 15:18:35.654971 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" event={"ID":"856ab139-589a-4b24-89ab-37ef20ef1762","Type":"ContainerDied","Data":"2e63bfc0732afd4612f504810e7c7bfe58aa51859337f55ab25c63e8646af12e"} Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.151641 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.224124 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-inventory\") pod \"856ab139-589a-4b24-89ab-37ef20ef1762\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.224242 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-ssh-key\") pod \"856ab139-589a-4b24-89ab-37ef20ef1762\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.224442 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sssh\" (UniqueName: \"kubernetes.io/projected/856ab139-589a-4b24-89ab-37ef20ef1762-kube-api-access-6sssh\") pod \"856ab139-589a-4b24-89ab-37ef20ef1762\" (UID: \"856ab139-589a-4b24-89ab-37ef20ef1762\") " Oct 01 15:18:37 crc 
kubenswrapper[4771]: I1001 15:18:37.237811 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/856ab139-589a-4b24-89ab-37ef20ef1762-kube-api-access-6sssh" (OuterVolumeSpecName: "kube-api-access-6sssh") pod "856ab139-589a-4b24-89ab-37ef20ef1762" (UID: "856ab139-589a-4b24-89ab-37ef20ef1762"). InnerVolumeSpecName "kube-api-access-6sssh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.268142 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "856ab139-589a-4b24-89ab-37ef20ef1762" (UID: "856ab139-589a-4b24-89ab-37ef20ef1762"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.269176 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-inventory" (OuterVolumeSpecName: "inventory") pod "856ab139-589a-4b24-89ab-37ef20ef1762" (UID: "856ab139-589a-4b24-89ab-37ef20ef1762"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.327661 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.327695 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/856ab139-589a-4b24-89ab-37ef20ef1762-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.327709 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sssh\" (UniqueName: \"kubernetes.io/projected/856ab139-589a-4b24-89ab-37ef20ef1762-kube-api-access-6sssh\") on node \"crc\" DevicePath \"\"" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.686408 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" event={"ID":"856ab139-589a-4b24-89ab-37ef20ef1762","Type":"ContainerDied","Data":"27c6691c17b571ce2f12f40e05f7d8aa038ffa6f948b17f624d0ca2ecebeae98"} Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.686459 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27c6691c17b571ce2f12f40e05f7d8aa038ffa6f948b17f624d0ca2ecebeae98" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.686532 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-l4srh" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.754992 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8"] Oct 01 15:18:37 crc kubenswrapper[4771]: E1001 15:18:37.759873 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="856ab139-589a-4b24-89ab-37ef20ef1762" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.759905 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="856ab139-589a-4b24-89ab-37ef20ef1762" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.760203 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="856ab139-589a-4b24-89ab-37ef20ef1762" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.761044 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.763200 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.763200 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.763686 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.764166 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.768132 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8"] Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.840038 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.840163 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnz2r\" (UniqueName: \"kubernetes.io/projected/0498d724-f802-4a21-9197-f87079f3c96e-kube-api-access-vnz2r\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.840233 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.840275 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.942465 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnz2r\" (UniqueName: \"kubernetes.io/projected/0498d724-f802-4a21-9197-f87079f3c96e-kube-api-access-vnz2r\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.942899 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.942937 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-bootstrap-combined-ca-bundle\") 
pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.942994 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.948069 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.948648 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.958648 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:37 crc kubenswrapper[4771]: I1001 15:18:37.959045 4771 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-vnz2r\" (UniqueName: \"kubernetes.io/projected/0498d724-f802-4a21-9197-f87079f3c96e-kube-api-access-vnz2r\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:38 crc kubenswrapper[4771]: I1001 15:18:38.080375 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:18:38 crc kubenswrapper[4771]: I1001 15:18:38.706478 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8"] Oct 01 15:18:39 crc kubenswrapper[4771]: I1001 15:18:39.708420 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" event={"ID":"0498d724-f802-4a21-9197-f87079f3c96e","Type":"ContainerStarted","Data":"789ca64d4f7cce4d88bb94cba1857cfc928ee441052a6bad568868c7f4f550e6"} Oct 01 15:18:39 crc kubenswrapper[4771]: I1001 15:18:39.708898 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" event={"ID":"0498d724-f802-4a21-9197-f87079f3c96e","Type":"ContainerStarted","Data":"cc63894a3d6cda7a516a22e25d6912b67495d16b2a21d5b448685b00b0924335"} Oct 01 15:18:39 crc kubenswrapper[4771]: I1001 15:18:39.731401 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" podStartSLOduration=2.138827435 podStartE2EDuration="2.731376324s" podCreationTimestamp="2025-10-01 15:18:37 +0000 UTC" firstStartedPulling="2025-10-01 15:18:38.701217988 +0000 UTC m=+1363.320393169" lastFinishedPulling="2025-10-01 15:18:39.293766877 +0000 UTC m=+1363.912942058" observedRunningTime="2025-10-01 15:18:39.72912172 +0000 UTC m=+1364.348296891" watchObservedRunningTime="2025-10-01 
15:18:39.731376324 +0000 UTC m=+1364.350551525" Oct 01 15:20:00 crc kubenswrapper[4771]: I1001 15:20:00.506369 4771 scope.go:117] "RemoveContainer" containerID="0d62b92665c1a93f086952661ef09c1be90e0a57654d03292b9451c0e1a66451" Oct 01 15:20:00 crc kubenswrapper[4771]: I1001 15:20:00.547900 4771 scope.go:117] "RemoveContainer" containerID="b7156af422e3c6aaf0282ef2b69573b0ade9765298ff0bfabec5bc5907220610" Oct 01 15:20:00 crc kubenswrapper[4771]: I1001 15:20:00.598959 4771 scope.go:117] "RemoveContainer" containerID="f749f84155fd265a1e036a5648224fc00c532993dd4d7c8e0cd2bc56722c553a" Oct 01 15:20:00 crc kubenswrapper[4771]: I1001 15:20:00.634442 4771 scope.go:117] "RemoveContainer" containerID="78f6f199594e13c88dab6b12df66e120e384a65068fea0849457119f74ae27fc" Oct 01 15:20:00 crc kubenswrapper[4771]: I1001 15:20:00.688637 4771 scope.go:117] "RemoveContainer" containerID="67d4f49b7771a6cd394edbf1f7a2ab2572e824e9ea022772d3e883951c3927c0" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.105802 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tvszx"] Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.109563 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.135299 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvszx"] Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.197039 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-catalog-content\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.197300 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zq69\" (UniqueName: \"kubernetes.io/projected/3b739e63-624b-487b-9108-7cd02eb190ee-kube-api-access-6zq69\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.197422 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-utilities\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.298874 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-catalog-content\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.298954 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6zq69\" (UniqueName: \"kubernetes.io/projected/3b739e63-624b-487b-9108-7cd02eb190ee-kube-api-access-6zq69\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.298997 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-utilities\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.299810 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-catalog-content\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.299899 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-utilities\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.321358 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zq69\" (UniqueName: \"kubernetes.io/projected/3b739e63-624b-487b-9108-7cd02eb190ee-kube-api-access-6zq69\") pod \"redhat-marketplace-tvszx\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.441144 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:07 crc kubenswrapper[4771]: I1001 15:20:07.901823 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvszx"] Oct 01 15:20:08 crc kubenswrapper[4771]: I1001 15:20:08.845947 4771 generic.go:334] "Generic (PLEG): container finished" podID="3b739e63-624b-487b-9108-7cd02eb190ee" containerID="e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18" exitCode=0 Oct 01 15:20:08 crc kubenswrapper[4771]: I1001 15:20:08.846230 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvszx" event={"ID":"3b739e63-624b-487b-9108-7cd02eb190ee","Type":"ContainerDied","Data":"e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18"} Oct 01 15:20:08 crc kubenswrapper[4771]: I1001 15:20:08.846320 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvszx" event={"ID":"3b739e63-624b-487b-9108-7cd02eb190ee","Type":"ContainerStarted","Data":"1b87c701d77d8e36d6f0fd97cc141852b66fa93bacd621a67e542179fe1e64b0"} Oct 01 15:20:10 crc kubenswrapper[4771]: I1001 15:20:10.876270 4771 generic.go:334] "Generic (PLEG): container finished" podID="3b739e63-624b-487b-9108-7cd02eb190ee" containerID="6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416" exitCode=0 Oct 01 15:20:10 crc kubenswrapper[4771]: I1001 15:20:10.876719 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvszx" event={"ID":"3b739e63-624b-487b-9108-7cd02eb190ee","Type":"ContainerDied","Data":"6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416"} Oct 01 15:20:11 crc kubenswrapper[4771]: I1001 15:20:11.888924 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvszx" 
event={"ID":"3b739e63-624b-487b-9108-7cd02eb190ee","Type":"ContainerStarted","Data":"0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b"} Oct 01 15:20:11 crc kubenswrapper[4771]: I1001 15:20:11.921290 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tvszx" podStartSLOduration=2.308577285 podStartE2EDuration="4.921269995s" podCreationTimestamp="2025-10-01 15:20:07 +0000 UTC" firstStartedPulling="2025-10-01 15:20:08.850912145 +0000 UTC m=+1453.470087346" lastFinishedPulling="2025-10-01 15:20:11.463604855 +0000 UTC m=+1456.082780056" observedRunningTime="2025-10-01 15:20:11.909076167 +0000 UTC m=+1456.528251358" watchObservedRunningTime="2025-10-01 15:20:11.921269995 +0000 UTC m=+1456.540445166" Oct 01 15:20:12 crc kubenswrapper[4771]: I1001 15:20:12.177547 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:20:12 crc kubenswrapper[4771]: I1001 15:20:12.177611 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:20:12 crc kubenswrapper[4771]: I1001 15:20:12.895758 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nwh6z"] Oct 01 15:20:12 crc kubenswrapper[4771]: I1001 15:20:12.898934 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:12 crc kubenswrapper[4771]: I1001 15:20:12.913066 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-utilities\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:12 crc kubenswrapper[4771]: I1001 15:20:12.913240 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-catalog-content\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:12 crc kubenswrapper[4771]: I1001 15:20:12.913326 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htdnw\" (UniqueName: \"kubernetes.io/projected/fc2497a6-91b8-42cc-8909-774b7bd7e83e-kube-api-access-htdnw\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:12 crc kubenswrapper[4771]: I1001 15:20:12.931297 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nwh6z"] Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.015820 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-catalog-content\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.015953 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-htdnw\" (UniqueName: \"kubernetes.io/projected/fc2497a6-91b8-42cc-8909-774b7bd7e83e-kube-api-access-htdnw\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.016078 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-utilities\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.017451 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-catalog-content\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.018092 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-utilities\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.048854 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htdnw\" (UniqueName: \"kubernetes.io/projected/fc2497a6-91b8-42cc-8909-774b7bd7e83e-kube-api-access-htdnw\") pod \"community-operators-nwh6z\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.229202 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.804416 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nwh6z"] Oct 01 15:20:13 crc kubenswrapper[4771]: I1001 15:20:13.923649 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwh6z" event={"ID":"fc2497a6-91b8-42cc-8909-774b7bd7e83e","Type":"ContainerStarted","Data":"76e682c173b4cbc0b438abbec233250eab5f37c9e8e9827889d0f5bc0d9885f0"} Oct 01 15:20:14 crc kubenswrapper[4771]: I1001 15:20:14.940420 4771 generic.go:334] "Generic (PLEG): container finished" podID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerID="7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143" exitCode=0 Oct 01 15:20:14 crc kubenswrapper[4771]: I1001 15:20:14.940499 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwh6z" event={"ID":"fc2497a6-91b8-42cc-8909-774b7bd7e83e","Type":"ContainerDied","Data":"7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143"} Oct 01 15:20:16 crc kubenswrapper[4771]: I1001 15:20:16.974593 4771 generic.go:334] "Generic (PLEG): container finished" podID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerID="7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f" exitCode=0 Oct 01 15:20:16 crc kubenswrapper[4771]: I1001 15:20:16.975161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwh6z" event={"ID":"fc2497a6-91b8-42cc-8909-774b7bd7e83e","Type":"ContainerDied","Data":"7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f"} Oct 01 15:20:17 crc kubenswrapper[4771]: I1001 15:20:17.442006 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:17 crc kubenswrapper[4771]: I1001 15:20:17.442094 4771 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:17 crc kubenswrapper[4771]: I1001 15:20:17.514963 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:18 crc kubenswrapper[4771]: I1001 15:20:18.007908 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwh6z" event={"ID":"fc2497a6-91b8-42cc-8909-774b7bd7e83e","Type":"ContainerStarted","Data":"61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729"} Oct 01 15:20:18 crc kubenswrapper[4771]: I1001 15:20:18.026118 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nwh6z" podStartSLOduration=3.536291452 podStartE2EDuration="6.026096447s" podCreationTimestamp="2025-10-01 15:20:12 +0000 UTC" firstStartedPulling="2025-10-01 15:20:14.94489372 +0000 UTC m=+1459.564068901" lastFinishedPulling="2025-10-01 15:20:17.434698725 +0000 UTC m=+1462.053873896" observedRunningTime="2025-10-01 15:20:18.015752803 +0000 UTC m=+1462.634928024" watchObservedRunningTime="2025-10-01 15:20:18.026096447 +0000 UTC m=+1462.645271618" Oct 01 15:20:18 crc kubenswrapper[4771]: I1001 15:20:18.072847 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:19 crc kubenswrapper[4771]: I1001 15:20:19.277070 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvszx"] Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.013324 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tvszx" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" containerName="registry-server" containerID="cri-o://0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b" gracePeriod=2 
Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.560978 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.696402 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-catalog-content\") pod \"3b739e63-624b-487b-9108-7cd02eb190ee\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.696466 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zq69\" (UniqueName: \"kubernetes.io/projected/3b739e63-624b-487b-9108-7cd02eb190ee-kube-api-access-6zq69\") pod \"3b739e63-624b-487b-9108-7cd02eb190ee\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.696505 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-utilities\") pod \"3b739e63-624b-487b-9108-7cd02eb190ee\" (UID: \"3b739e63-624b-487b-9108-7cd02eb190ee\") " Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.698068 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-utilities" (OuterVolumeSpecName: "utilities") pod "3b739e63-624b-487b-9108-7cd02eb190ee" (UID: "3b739e63-624b-487b-9108-7cd02eb190ee"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.703403 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b739e63-624b-487b-9108-7cd02eb190ee-kube-api-access-6zq69" (OuterVolumeSpecName: "kube-api-access-6zq69") pod "3b739e63-624b-487b-9108-7cd02eb190ee" (UID: "3b739e63-624b-487b-9108-7cd02eb190ee"). InnerVolumeSpecName "kube-api-access-6zq69". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.714060 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b739e63-624b-487b-9108-7cd02eb190ee" (UID: "3b739e63-624b-487b-9108-7cd02eb190ee"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.799376 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.799422 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zq69\" (UniqueName: \"kubernetes.io/projected/3b739e63-624b-487b-9108-7cd02eb190ee-kube-api-access-6zq69\") on node \"crc\" DevicePath \"\"" Oct 01 15:20:20 crc kubenswrapper[4771]: I1001 15:20:20.799433 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b739e63-624b-487b-9108-7cd02eb190ee-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.022586 4771 generic.go:334] "Generic (PLEG): container finished" podID="3b739e63-624b-487b-9108-7cd02eb190ee" 
containerID="0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b" exitCode=0 Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.022625 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvszx" event={"ID":"3b739e63-624b-487b-9108-7cd02eb190ee","Type":"ContainerDied","Data":"0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b"} Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.022650 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tvszx" event={"ID":"3b739e63-624b-487b-9108-7cd02eb190ee","Type":"ContainerDied","Data":"1b87c701d77d8e36d6f0fd97cc141852b66fa93bacd621a67e542179fe1e64b0"} Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.022665 4771 scope.go:117] "RemoveContainer" containerID="0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.022662 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tvszx" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.041746 4771 scope.go:117] "RemoveContainer" containerID="6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.053127 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvszx"] Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.061657 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tvszx"] Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.082286 4771 scope.go:117] "RemoveContainer" containerID="e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.113840 4771 scope.go:117] "RemoveContainer" containerID="0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b" Oct 01 15:20:21 crc kubenswrapper[4771]: E1001 15:20:21.114305 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b\": container with ID starting with 0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b not found: ID does not exist" containerID="0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.114343 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b"} err="failed to get container status \"0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b\": rpc error: code = NotFound desc = could not find container \"0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b\": container with ID starting with 0dc1d8770e407cbbf2b490eca29feaab102443ad9a832a75cb0e1aaa166b5a9b not found: 
ID does not exist" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.114368 4771 scope.go:117] "RemoveContainer" containerID="6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416" Oct 01 15:20:21 crc kubenswrapper[4771]: E1001 15:20:21.114654 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416\": container with ID starting with 6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416 not found: ID does not exist" containerID="6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.114682 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416"} err="failed to get container status \"6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416\": rpc error: code = NotFound desc = could not find container \"6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416\": container with ID starting with 6004217d6e944e11280e1d625721a5a117c84270c7f7fa4ef378b68cdd017416 not found: ID does not exist" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.114700 4771 scope.go:117] "RemoveContainer" containerID="e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18" Oct 01 15:20:21 crc kubenswrapper[4771]: E1001 15:20:21.115036 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18\": container with ID starting with e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18 not found: ID does not exist" containerID="e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18" Oct 01 15:20:21 crc kubenswrapper[4771]: I1001 15:20:21.115077 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18"} err="failed to get container status \"e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18\": rpc error: code = NotFound desc = could not find container \"e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18\": container with ID starting with e614b87744725c7f1c82dd3bf232cd8015ed342e6901c0c25f45c17998945e18 not found: ID does not exist" Oct 01 15:20:22 crc kubenswrapper[4771]: I1001 15:20:22.004040 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" path="/var/lib/kubelet/pods/3b739e63-624b-487b-9108-7cd02eb190ee/volumes" Oct 01 15:20:23 crc kubenswrapper[4771]: I1001 15:20:23.229447 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:23 crc kubenswrapper[4771]: I1001 15:20:23.230224 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:23 crc kubenswrapper[4771]: I1001 15:20:23.289097 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:24 crc kubenswrapper[4771]: I1001 15:20:24.096200 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:24 crc kubenswrapper[4771]: I1001 15:20:24.881948 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nwh6z"] Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.074263 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nwh6z" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerName="registry-server" 
containerID="cri-o://61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729" gracePeriod=2 Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.578163 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.749194 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-utilities\") pod \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.749310 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-catalog-content\") pod \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.749489 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htdnw\" (UniqueName: \"kubernetes.io/projected/fc2497a6-91b8-42cc-8909-774b7bd7e83e-kube-api-access-htdnw\") pod \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\" (UID: \"fc2497a6-91b8-42cc-8909-774b7bd7e83e\") " Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.750429 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-utilities" (OuterVolumeSpecName: "utilities") pod "fc2497a6-91b8-42cc-8909-774b7bd7e83e" (UID: "fc2497a6-91b8-42cc-8909-774b7bd7e83e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.757372 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc2497a6-91b8-42cc-8909-774b7bd7e83e-kube-api-access-htdnw" (OuterVolumeSpecName: "kube-api-access-htdnw") pod "fc2497a6-91b8-42cc-8909-774b7bd7e83e" (UID: "fc2497a6-91b8-42cc-8909-774b7bd7e83e"). InnerVolumeSpecName "kube-api-access-htdnw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.813820 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc2497a6-91b8-42cc-8909-774b7bd7e83e" (UID: "fc2497a6-91b8-42cc-8909-774b7bd7e83e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.851594 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htdnw\" (UniqueName: \"kubernetes.io/projected/fc2497a6-91b8-42cc-8909-774b7bd7e83e-kube-api-access-htdnw\") on node \"crc\" DevicePath \"\"" Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.851638 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:20:26 crc kubenswrapper[4771]: I1001 15:20:26.851647 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc2497a6-91b8-42cc-8909-774b7bd7e83e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.084842 4771 generic.go:334] "Generic (PLEG): container finished" podID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" 
containerID="61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729" exitCode=0 Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.084894 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nwh6z" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.084900 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwh6z" event={"ID":"fc2497a6-91b8-42cc-8909-774b7bd7e83e","Type":"ContainerDied","Data":"61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729"} Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.085844 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nwh6z" event={"ID":"fc2497a6-91b8-42cc-8909-774b7bd7e83e","Type":"ContainerDied","Data":"76e682c173b4cbc0b438abbec233250eab5f37c9e8e9827889d0f5bc0d9885f0"} Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.085885 4771 scope.go:117] "RemoveContainer" containerID="61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.118880 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nwh6z"] Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.133559 4771 scope.go:117] "RemoveContainer" containerID="7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.135389 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-nwh6z"] Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.153577 4771 scope.go:117] "RemoveContainer" containerID="7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.193995 4771 scope.go:117] "RemoveContainer" containerID="61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729" Oct 01 
15:20:27 crc kubenswrapper[4771]: E1001 15:20:27.194540 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729\": container with ID starting with 61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729 not found: ID does not exist" containerID="61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.194568 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729"} err="failed to get container status \"61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729\": rpc error: code = NotFound desc = could not find container \"61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729\": container with ID starting with 61ac40fd7f619f774b8993d09a55123c2660375d3bc370ce6a92bc51a3ac8729 not found: ID does not exist" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.194589 4771 scope.go:117] "RemoveContainer" containerID="7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f" Oct 01 15:20:27 crc kubenswrapper[4771]: E1001 15:20:27.195202 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f\": container with ID starting with 7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f not found: ID does not exist" containerID="7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.195260 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f"} err="failed to get container status 
\"7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f\": rpc error: code = NotFound desc = could not find container \"7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f\": container with ID starting with 7c956fbc2a3e1b7420d33a576e7b5def3ded45f553221f7868c695b8ce93315f not found: ID does not exist" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.195300 4771 scope.go:117] "RemoveContainer" containerID="7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143" Oct 01 15:20:27 crc kubenswrapper[4771]: E1001 15:20:27.195601 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143\": container with ID starting with 7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143 not found: ID does not exist" containerID="7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.195626 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143"} err="failed to get container status \"7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143\": rpc error: code = NotFound desc = could not find container \"7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143\": container with ID starting with 7fcbef590acd8bb621adb1bdbfcc4a9ddddee0558153184adf3f0803a157e143 not found: ID does not exist" Oct 01 15:20:27 crc kubenswrapper[4771]: I1001 15:20:27.994623 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" path="/var/lib/kubelet/pods/fc2497a6-91b8-42cc-8909-774b7bd7e83e/volumes" Oct 01 15:20:42 crc kubenswrapper[4771]: I1001 15:20:42.177879 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:20:42 crc kubenswrapper[4771]: I1001 15:20:42.178578 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.673001 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8v7zf"] Oct 01 15:20:47 crc kubenswrapper[4771]: E1001 15:20:47.673885 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" containerName="extract-utilities" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.673898 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" containerName="extract-utilities" Oct 01 15:20:47 crc kubenswrapper[4771]: E1001 15:20:47.673909 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerName="registry-server" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.673915 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerName="registry-server" Oct 01 15:20:47 crc kubenswrapper[4771]: E1001 15:20:47.673927 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerName="extract-utilities" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.673935 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerName="extract-utilities" Oct 01 15:20:47 crc kubenswrapper[4771]: E1001 
15:20:47.673948 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" containerName="extract-content" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.673954 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" containerName="extract-content" Oct 01 15:20:47 crc kubenswrapper[4771]: E1001 15:20:47.673981 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerName="extract-content" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.673990 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerName="extract-content" Oct 01 15:20:47 crc kubenswrapper[4771]: E1001 15:20:47.674004 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" containerName="registry-server" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.674010 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" containerName="registry-server" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.674184 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b739e63-624b-487b-9108-7cd02eb190ee" containerName="registry-server" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.674203 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc2497a6-91b8-42cc-8909-774b7bd7e83e" containerName="registry-server" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.675546 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.685880 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8v7zf"] Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.694750 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfb6g\" (UniqueName: \"kubernetes.io/projected/c99d7ba3-b8ea-4898-b08c-f8337e10193b-kube-api-access-jfb6g\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.694885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-utilities\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.695189 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-catalog-content\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.796473 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-catalog-content\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.796535 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jfb6g\" (UniqueName: \"kubernetes.io/projected/c99d7ba3-b8ea-4898-b08c-f8337e10193b-kube-api-access-jfb6g\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.796609 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-utilities\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.797150 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-utilities\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.797411 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-catalog-content\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:47 crc kubenswrapper[4771]: I1001 15:20:47.822078 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfb6g\" (UniqueName: \"kubernetes.io/projected/c99d7ba3-b8ea-4898-b08c-f8337e10193b-kube-api-access-jfb6g\") pod \"certified-operators-8v7zf\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:48 crc kubenswrapper[4771]: I1001 15:20:48.008008 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:48 crc kubenswrapper[4771]: I1001 15:20:48.512980 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8v7zf"] Oct 01 15:20:49 crc kubenswrapper[4771]: I1001 15:20:49.357165 4771 generic.go:334] "Generic (PLEG): container finished" podID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerID="d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19" exitCode=0 Oct 01 15:20:49 crc kubenswrapper[4771]: I1001 15:20:49.357492 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v7zf" event={"ID":"c99d7ba3-b8ea-4898-b08c-f8337e10193b","Type":"ContainerDied","Data":"d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19"} Oct 01 15:20:49 crc kubenswrapper[4771]: I1001 15:20:49.357529 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v7zf" event={"ID":"c99d7ba3-b8ea-4898-b08c-f8337e10193b","Type":"ContainerStarted","Data":"9a8caa6dbcebdc03f9fee6948ab8f988c220207cdf0db873398df002fa24937b"} Oct 01 15:20:52 crc kubenswrapper[4771]: I1001 15:20:52.398347 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v7zf" event={"ID":"c99d7ba3-b8ea-4898-b08c-f8337e10193b","Type":"ContainerStarted","Data":"fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6"} Oct 01 15:20:53 crc kubenswrapper[4771]: I1001 15:20:53.413227 4771 generic.go:334] "Generic (PLEG): container finished" podID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerID="fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6" exitCode=0 Oct 01 15:20:53 crc kubenswrapper[4771]: I1001 15:20:53.413354 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v7zf" 
event={"ID":"c99d7ba3-b8ea-4898-b08c-f8337e10193b","Type":"ContainerDied","Data":"fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6"} Oct 01 15:20:55 crc kubenswrapper[4771]: I1001 15:20:55.437624 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v7zf" event={"ID":"c99d7ba3-b8ea-4898-b08c-f8337e10193b","Type":"ContainerStarted","Data":"8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0"} Oct 01 15:20:55 crc kubenswrapper[4771]: I1001 15:20:55.469247 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8v7zf" podStartSLOduration=3.877833302 podStartE2EDuration="8.469221858s" podCreationTimestamp="2025-10-01 15:20:47 +0000 UTC" firstStartedPulling="2025-10-01 15:20:49.359211689 +0000 UTC m=+1493.978386860" lastFinishedPulling="2025-10-01 15:20:53.950600205 +0000 UTC m=+1498.569775416" observedRunningTime="2025-10-01 15:20:55.460977961 +0000 UTC m=+1500.080153232" watchObservedRunningTime="2025-10-01 15:20:55.469221858 +0000 UTC m=+1500.088397049" Oct 01 15:20:58 crc kubenswrapper[4771]: I1001 15:20:58.008558 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:58 crc kubenswrapper[4771]: I1001 15:20:58.009017 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:20:58 crc kubenswrapper[4771]: I1001 15:20:58.070059 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:21:00 crc kubenswrapper[4771]: I1001 15:21:00.797614 4771 scope.go:117] "RemoveContainer" containerID="549414480fe1f01dcde3c7a3370fff68f0e7e8fd016e99e433070e1d295463d0" Oct 01 15:21:00 crc kubenswrapper[4771]: I1001 15:21:00.835877 4771 scope.go:117] "RemoveContainer" 
containerID="0aef501b8ed99dbcdaf337e52e02094a16b36ba8436428288b3845e0ff56df48" Oct 01 15:21:08 crc kubenswrapper[4771]: I1001 15:21:08.097149 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:21:08 crc kubenswrapper[4771]: I1001 15:21:08.186261 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8v7zf"] Oct 01 15:21:08 crc kubenswrapper[4771]: I1001 15:21:08.612776 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8v7zf" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerName="registry-server" containerID="cri-o://8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0" gracePeriod=2 Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.166952 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.349927 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-catalog-content\") pod \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.350330 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfb6g\" (UniqueName: \"kubernetes.io/projected/c99d7ba3-b8ea-4898-b08c-f8337e10193b-kube-api-access-jfb6g\") pod \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.350364 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-utilities\") pod 
\"c99d7ba3-b8ea-4898-b08c-f8337e10193b\" (UID: \"c99d7ba3-b8ea-4898-b08c-f8337e10193b\") " Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.351542 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-utilities" (OuterVolumeSpecName: "utilities") pod "c99d7ba3-b8ea-4898-b08c-f8337e10193b" (UID: "c99d7ba3-b8ea-4898-b08c-f8337e10193b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.360899 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c99d7ba3-b8ea-4898-b08c-f8337e10193b-kube-api-access-jfb6g" (OuterVolumeSpecName: "kube-api-access-jfb6g") pod "c99d7ba3-b8ea-4898-b08c-f8337e10193b" (UID: "c99d7ba3-b8ea-4898-b08c-f8337e10193b"). InnerVolumeSpecName "kube-api-access-jfb6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.430259 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c99d7ba3-b8ea-4898-b08c-f8337e10193b" (UID: "c99d7ba3-b8ea-4898-b08c-f8337e10193b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.453926 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfb6g\" (UniqueName: \"kubernetes.io/projected/c99d7ba3-b8ea-4898-b08c-f8337e10193b-kube-api-access-jfb6g\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.454005 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.454041 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c99d7ba3-b8ea-4898-b08c-f8337e10193b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.627898 4771 generic.go:334] "Generic (PLEG): container finished" podID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerID="8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0" exitCode=0 Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.627944 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v7zf" event={"ID":"c99d7ba3-b8ea-4898-b08c-f8337e10193b","Type":"ContainerDied","Data":"8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0"} Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.627985 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8v7zf" event={"ID":"c99d7ba3-b8ea-4898-b08c-f8337e10193b","Type":"ContainerDied","Data":"9a8caa6dbcebdc03f9fee6948ab8f988c220207cdf0db873398df002fa24937b"} Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.628029 4771 scope.go:117] "RemoveContainer" containerID="8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 
15:21:09.628132 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8v7zf" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.656201 4771 scope.go:117] "RemoveContainer" containerID="fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.687893 4771 scope.go:117] "RemoveContainer" containerID="d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.697255 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8v7zf"] Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.710660 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8v7zf"] Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.755516 4771 scope.go:117] "RemoveContainer" containerID="8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0" Oct 01 15:21:09 crc kubenswrapper[4771]: E1001 15:21:09.756199 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0\": container with ID starting with 8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0 not found: ID does not exist" containerID="8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.756240 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0"} err="failed to get container status \"8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0\": rpc error: code = NotFound desc = could not find container \"8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0\": container with ID starting with 
8bb6b0f6a6e7075aaa7eb6cd14d2727d8f374a99dbea78caf2574d07deea28b0 not found: ID does not exist" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.756270 4771 scope.go:117] "RemoveContainer" containerID="fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6" Oct 01 15:21:09 crc kubenswrapper[4771]: E1001 15:21:09.756697 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6\": container with ID starting with fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6 not found: ID does not exist" containerID="fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.756822 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6"} err="failed to get container status \"fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6\": rpc error: code = NotFound desc = could not find container \"fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6\": container with ID starting with fa8470f98056d28340ad8488e401de173349e4f0fcd1fd69a6a2f7adc675dfd6 not found: ID does not exist" Oct 01 15:21:09 crc kubenswrapper[4771]: I1001 15:21:09.756883 4771 scope.go:117] "RemoveContainer" containerID="d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19" Oct 01 15:21:09 crc kubenswrapper[4771]: E1001 15:21:09.757442 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19\": container with ID starting with d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19 not found: ID does not exist" containerID="d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19" Oct 01 15:21:09 crc 
kubenswrapper[4771]: I1001 15:21:09.757479 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19"} err="failed to get container status \"d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19\": rpc error: code = NotFound desc = could not find container \"d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19\": container with ID starting with d575c6b2e3b62f3adc6bc8efc27af25281a4c3c42617545a7446a1b5d65fff19 not found: ID does not exist" Oct 01 15:21:10 crc kubenswrapper[4771]: I1001 15:21:10.013346 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" path="/var/lib/kubelet/pods/c99d7ba3-b8ea-4898-b08c-f8337e10193b/volumes" Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.177049 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.177664 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.177713 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.178505 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.178570 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" gracePeriod=600 Oct 01 15:21:12 crc kubenswrapper[4771]: E1001 15:21:12.310574 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.661668 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" exitCode=0 Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.661707 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2"} Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.661763 4771 scope.go:117] "RemoveContainer" containerID="6fb5c90115d7e5e2b881eca81e834dd62c83b4059c941e9b801dafa27eac271e" Oct 01 15:21:12 crc kubenswrapper[4771]: I1001 15:21:12.662801 4771 
scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:21:12 crc kubenswrapper[4771]: E1001 15:21:12.663102 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:21:22 crc kubenswrapper[4771]: I1001 15:21:22.979708 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bwhhg"] Oct 01 15:21:22 crc kubenswrapper[4771]: E1001 15:21:22.980748 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerName="extract-content" Oct 01 15:21:22 crc kubenswrapper[4771]: I1001 15:21:22.980766 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerName="extract-content" Oct 01 15:21:22 crc kubenswrapper[4771]: E1001 15:21:22.980785 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerName="extract-utilities" Oct 01 15:21:22 crc kubenswrapper[4771]: I1001 15:21:22.980791 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerName="extract-utilities" Oct 01 15:21:22 crc kubenswrapper[4771]: E1001 15:21:22.980822 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerName="registry-server" Oct 01 15:21:22 crc kubenswrapper[4771]: I1001 15:21:22.980831 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerName="registry-server" Oct 01 15:21:22 crc 
kubenswrapper[4771]: I1001 15:21:22.981059 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c99d7ba3-b8ea-4898-b08c-f8337e10193b" containerName="registry-server" Oct 01 15:21:22 crc kubenswrapper[4771]: I1001 15:21:22.982443 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.003527 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bwhhg"] Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.048983 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-catalog-content\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.049115 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmctg\" (UniqueName: \"kubernetes.io/projected/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-kube-api-access-rmctg\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.049253 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-utilities\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.151372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-catalog-content\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.151437 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmctg\" (UniqueName: \"kubernetes.io/projected/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-kube-api-access-rmctg\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.151484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-utilities\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.151947 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-catalog-content\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.152000 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-utilities\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.170048 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmctg\" (UniqueName: 
\"kubernetes.io/projected/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-kube-api-access-rmctg\") pod \"redhat-operators-bwhhg\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.319369 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.792834 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bwhhg"] Oct 01 15:21:23 crc kubenswrapper[4771]: I1001 15:21:23.985300 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:21:23 crc kubenswrapper[4771]: E1001 15:21:23.985688 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:21:24 crc kubenswrapper[4771]: I1001 15:21:24.817365 4771 generic.go:334] "Generic (PLEG): container finished" podID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerID="ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc" exitCode=0 Oct 01 15:21:24 crc kubenswrapper[4771]: I1001 15:21:24.817409 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwhhg" event={"ID":"93b6553c-fa44-4b14-a1e9-1eebb51ccf52","Type":"ContainerDied","Data":"ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc"} Oct 01 15:21:24 crc kubenswrapper[4771]: I1001 15:21:24.817437 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwhhg" 
event={"ID":"93b6553c-fa44-4b14-a1e9-1eebb51ccf52","Type":"ContainerStarted","Data":"aae577945ee0c6e22fa27c87bbc1df75c06757264a7086616ba5251034e7bd1d"} Oct 01 15:21:25 crc kubenswrapper[4771]: I1001 15:21:25.831867 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwhhg" event={"ID":"93b6553c-fa44-4b14-a1e9-1eebb51ccf52","Type":"ContainerStarted","Data":"25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2"} Oct 01 15:21:26 crc kubenswrapper[4771]: I1001 15:21:26.846045 4771 generic.go:334] "Generic (PLEG): container finished" podID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerID="25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2" exitCode=0 Oct 01 15:21:26 crc kubenswrapper[4771]: I1001 15:21:26.846142 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwhhg" event={"ID":"93b6553c-fa44-4b14-a1e9-1eebb51ccf52","Type":"ContainerDied","Data":"25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2"} Oct 01 15:21:27 crc kubenswrapper[4771]: I1001 15:21:27.857505 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwhhg" event={"ID":"93b6553c-fa44-4b14-a1e9-1eebb51ccf52","Type":"ContainerStarted","Data":"fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559"} Oct 01 15:21:27 crc kubenswrapper[4771]: I1001 15:21:27.882020 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bwhhg" podStartSLOduration=3.207214703 podStartE2EDuration="5.882003035s" podCreationTimestamp="2025-10-01 15:21:22 +0000 UTC" firstStartedPulling="2025-10-01 15:21:24.819556556 +0000 UTC m=+1529.438731727" lastFinishedPulling="2025-10-01 15:21:27.494344878 +0000 UTC m=+1532.113520059" observedRunningTime="2025-10-01 15:21:27.880215732 +0000 UTC m=+1532.499390903" watchObservedRunningTime="2025-10-01 15:21:27.882003035 +0000 UTC m=+1532.501178206" 
Oct 01 15:21:33 crc kubenswrapper[4771]: I1001 15:21:33.320573 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:33 crc kubenswrapper[4771]: I1001 15:21:33.321403 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:33 crc kubenswrapper[4771]: I1001 15:21:33.397660 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:33 crc kubenswrapper[4771]: I1001 15:21:33.976693 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:34 crc kubenswrapper[4771]: I1001 15:21:34.035649 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bwhhg"] Oct 01 15:21:35 crc kubenswrapper[4771]: I1001 15:21:35.939173 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bwhhg" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerName="registry-server" containerID="cri-o://fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559" gracePeriod=2 Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.431966 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.533606 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-catalog-content\") pod \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.533693 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmctg\" (UniqueName: \"kubernetes.io/projected/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-kube-api-access-rmctg\") pod \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.533921 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-utilities\") pod \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\" (UID: \"93b6553c-fa44-4b14-a1e9-1eebb51ccf52\") " Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.535201 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-utilities" (OuterVolumeSpecName: "utilities") pod "93b6553c-fa44-4b14-a1e9-1eebb51ccf52" (UID: "93b6553c-fa44-4b14-a1e9-1eebb51ccf52"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.541089 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-kube-api-access-rmctg" (OuterVolumeSpecName: "kube-api-access-rmctg") pod "93b6553c-fa44-4b14-a1e9-1eebb51ccf52" (UID: "93b6553c-fa44-4b14-a1e9-1eebb51ccf52"). InnerVolumeSpecName "kube-api-access-rmctg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.608670 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "93b6553c-fa44-4b14-a1e9-1eebb51ccf52" (UID: "93b6553c-fa44-4b14-a1e9-1eebb51ccf52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.636050 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.636080 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.636096 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmctg\" (UniqueName: \"kubernetes.io/projected/93b6553c-fa44-4b14-a1e9-1eebb51ccf52-kube-api-access-rmctg\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.951938 4771 generic.go:334] "Generic (PLEG): container finished" podID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerID="fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559" exitCode=0 Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.952028 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bwhhg" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.952042 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwhhg" event={"ID":"93b6553c-fa44-4b14-a1e9-1eebb51ccf52","Type":"ContainerDied","Data":"fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559"} Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.952305 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwhhg" event={"ID":"93b6553c-fa44-4b14-a1e9-1eebb51ccf52","Type":"ContainerDied","Data":"aae577945ee0c6e22fa27c87bbc1df75c06757264a7086616ba5251034e7bd1d"} Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.952327 4771 scope.go:117] "RemoveContainer" containerID="fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.985136 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:21:36 crc kubenswrapper[4771]: E1001 15:21:36.985429 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.994134 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bwhhg"] Oct 01 15:21:36 crc kubenswrapper[4771]: I1001 15:21:36.996766 4771 scope.go:117] "RemoveContainer" containerID="25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2" Oct 01 15:21:37 crc kubenswrapper[4771]: I1001 15:21:37.010162 4771 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bwhhg"] Oct 01 15:21:37 crc kubenswrapper[4771]: I1001 15:21:37.038943 4771 scope.go:117] "RemoveContainer" containerID="ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc" Oct 01 15:21:37 crc kubenswrapper[4771]: I1001 15:21:37.073952 4771 scope.go:117] "RemoveContainer" containerID="fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559" Oct 01 15:21:37 crc kubenswrapper[4771]: E1001 15:21:37.074406 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559\": container with ID starting with fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559 not found: ID does not exist" containerID="fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559" Oct 01 15:21:37 crc kubenswrapper[4771]: I1001 15:21:37.074439 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559"} err="failed to get container status \"fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559\": rpc error: code = NotFound desc = could not find container \"fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559\": container with ID starting with fc185d18f63ec0bba152cee7c6d4da54ef26729b7a006ac3aeb2f4c51ca37559 not found: ID does not exist" Oct 01 15:21:37 crc kubenswrapper[4771]: I1001 15:21:37.074461 4771 scope.go:117] "RemoveContainer" containerID="25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2" Oct 01 15:21:37 crc kubenswrapper[4771]: E1001 15:21:37.074701 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2\": container with ID starting with 
25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2 not found: ID does not exist" containerID="25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2" Oct 01 15:21:37 crc kubenswrapper[4771]: I1001 15:21:37.074810 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2"} err="failed to get container status \"25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2\": rpc error: code = NotFound desc = could not find container \"25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2\": container with ID starting with 25de68f52bb95b830fd405686c17f867e6f8195b9c26bbddcc3beb6e9e3e6ef2 not found: ID does not exist" Oct 01 15:21:37 crc kubenswrapper[4771]: I1001 15:21:37.074890 4771 scope.go:117] "RemoveContainer" containerID="ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc" Oct 01 15:21:37 crc kubenswrapper[4771]: E1001 15:21:37.075303 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc\": container with ID starting with ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc not found: ID does not exist" containerID="ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc" Oct 01 15:21:37 crc kubenswrapper[4771]: I1001 15:21:37.075346 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc"} err="failed to get container status \"ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc\": rpc error: code = NotFound desc = could not find container \"ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc\": container with ID starting with ba0dc3ad6c50277640389d55a21749ea88bcb63fd5d43c1cca1397fb466d93cc not found: ID does not 
exist" Oct 01 15:21:38 crc kubenswrapper[4771]: I1001 15:21:38.003467 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" path="/var/lib/kubelet/pods/93b6553c-fa44-4b14-a1e9-1eebb51ccf52/volumes" Oct 01 15:21:45 crc kubenswrapper[4771]: I1001 15:21:45.066489 4771 generic.go:334] "Generic (PLEG): container finished" podID="0498d724-f802-4a21-9197-f87079f3c96e" containerID="789ca64d4f7cce4d88bb94cba1857cfc928ee441052a6bad568868c7f4f550e6" exitCode=0 Oct 01 15:21:45 crc kubenswrapper[4771]: I1001 15:21:45.066599 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" event={"ID":"0498d724-f802-4a21-9197-f87079f3c96e","Type":"ContainerDied","Data":"789ca64d4f7cce4d88bb94cba1857cfc928ee441052a6bad568868c7f4f550e6"} Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.591398 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.764212 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-ssh-key\") pod \"0498d724-f802-4a21-9197-f87079f3c96e\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.764262 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-bootstrap-combined-ca-bundle\") pod \"0498d724-f802-4a21-9197-f87079f3c96e\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.764295 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnz2r\" (UniqueName: 
\"kubernetes.io/projected/0498d724-f802-4a21-9197-f87079f3c96e-kube-api-access-vnz2r\") pod \"0498d724-f802-4a21-9197-f87079f3c96e\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.764340 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-inventory\") pod \"0498d724-f802-4a21-9197-f87079f3c96e\" (UID: \"0498d724-f802-4a21-9197-f87079f3c96e\") " Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.771218 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0498d724-f802-4a21-9197-f87079f3c96e" (UID: "0498d724-f802-4a21-9197-f87079f3c96e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.771449 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0498d724-f802-4a21-9197-f87079f3c96e-kube-api-access-vnz2r" (OuterVolumeSpecName: "kube-api-access-vnz2r") pod "0498d724-f802-4a21-9197-f87079f3c96e" (UID: "0498d724-f802-4a21-9197-f87079f3c96e"). InnerVolumeSpecName "kube-api-access-vnz2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.801962 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0498d724-f802-4a21-9197-f87079f3c96e" (UID: "0498d724-f802-4a21-9197-f87079f3c96e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.811639 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-inventory" (OuterVolumeSpecName: "inventory") pod "0498d724-f802-4a21-9197-f87079f3c96e" (UID: "0498d724-f802-4a21-9197-f87079f3c96e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.866164 4771 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.866519 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnz2r\" (UniqueName: \"kubernetes.io/projected/0498d724-f802-4a21-9197-f87079f3c96e-kube-api-access-vnz2r\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.866535 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:46 crc kubenswrapper[4771]: I1001 15:21:46.866549 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0498d724-f802-4a21-9197-f87079f3c96e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.089480 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" event={"ID":"0498d724-f802-4a21-9197-f87079f3c96e","Type":"ContainerDied","Data":"cc63894a3d6cda7a516a22e25d6912b67495d16b2a21d5b448685b00b0924335"} Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.089685 4771 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="cc63894a3d6cda7a516a22e25d6912b67495d16b2a21d5b448685b00b0924335" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.089579 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.191988 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6"] Oct 01 15:21:47 crc kubenswrapper[4771]: E1001 15:21:47.192673 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerName="registry-server" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.192702 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerName="registry-server" Oct 01 15:21:47 crc kubenswrapper[4771]: E1001 15:21:47.192767 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0498d724-f802-4a21-9197-f87079f3c96e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.192782 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0498d724-f802-4a21-9197-f87079f3c96e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 15:21:47 crc kubenswrapper[4771]: E1001 15:21:47.192818 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerName="extract-content" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.192831 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerName="extract-content" Oct 01 15:21:47 crc kubenswrapper[4771]: E1001 15:21:47.192855 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerName="extract-utilities" Oct 01 15:21:47 crc 
kubenswrapper[4771]: I1001 15:21:47.192869 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerName="extract-utilities" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.193215 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="93b6553c-fa44-4b14-a1e9-1eebb51ccf52" containerName="registry-server" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.193291 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0498d724-f802-4a21-9197-f87079f3c96e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.194328 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.200428 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.200453 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.202381 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.202863 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.214474 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6"] Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.284424 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.284499 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgt5m\" (UniqueName: \"kubernetes.io/projected/c69dcf56-20fa-4a9a-992c-a73435ff9102-kube-api-access-wgt5m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.284792 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.386684 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.386820 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc 
kubenswrapper[4771]: I1001 15:21:47.386843 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgt5m\" (UniqueName: \"kubernetes.io/projected/c69dcf56-20fa-4a9a-992c-a73435ff9102-kube-api-access-wgt5m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.395349 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.395437 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.407079 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgt5m\" (UniqueName: \"kubernetes.io/projected/c69dcf56-20fa-4a9a-992c-a73435ff9102-kube-api-access-wgt5m\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:47 crc kubenswrapper[4771]: I1001 15:21:47.522139 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:21:48 crc kubenswrapper[4771]: I1001 15:21:48.093345 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6"] Oct 01 15:21:48 crc kubenswrapper[4771]: I1001 15:21:48.985720 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:21:48 crc kubenswrapper[4771]: E1001 15:21:48.986506 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:21:49 crc kubenswrapper[4771]: I1001 15:21:49.116324 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" event={"ID":"c69dcf56-20fa-4a9a-992c-a73435ff9102","Type":"ContainerStarted","Data":"d7f633632ecb0792128a438430f89b0ee1f937ccaef1f15f6754569881b371d9"} Oct 01 15:21:49 crc kubenswrapper[4771]: I1001 15:21:49.116415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" event={"ID":"c69dcf56-20fa-4a9a-992c-a73435ff9102","Type":"ContainerStarted","Data":"fbe65fc1bf79826fe31b28cbb29bb40b5a6025840a881b7f96367df482b33d4d"} Oct 01 15:21:49 crc kubenswrapper[4771]: I1001 15:21:49.147543 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" podStartSLOduration=1.6251980769999999 podStartE2EDuration="2.147521993s" podCreationTimestamp="2025-10-01 15:21:47 +0000 UTC" 
firstStartedPulling="2025-10-01 15:21:48.104890149 +0000 UTC m=+1552.724065330" lastFinishedPulling="2025-10-01 15:21:48.627214075 +0000 UTC m=+1553.246389246" observedRunningTime="2025-10-01 15:21:49.145794371 +0000 UTC m=+1553.764969582" watchObservedRunningTime="2025-10-01 15:21:49.147521993 +0000 UTC m=+1553.766697174" Oct 01 15:22:00 crc kubenswrapper[4771]: I1001 15:22:00.950004 4771 scope.go:117] "RemoveContainer" containerID="521bdcfb3f0193028c6446bb8fddd27bbac8ca17f47c38897bedb817033c17c4" Oct 01 15:22:00 crc kubenswrapper[4771]: I1001 15:22:00.998296 4771 scope.go:117] "RemoveContainer" containerID="fa57cb76bd6f8c0c37a8efb084a35029ebee5e8e822464bf394b1b9b4d945d28" Oct 01 15:22:02 crc kubenswrapper[4771]: I1001 15:22:02.985985 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:22:02 crc kubenswrapper[4771]: E1001 15:22:02.986760 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:22:15 crc kubenswrapper[4771]: I1001 15:22:14.986367 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:22:15 crc kubenswrapper[4771]: E1001 15:22:14.987227 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" 
podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:22:26 crc kubenswrapper[4771]: I1001 15:22:26.985350 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:22:26 crc kubenswrapper[4771]: E1001 15:22:26.987128 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:22:37 crc kubenswrapper[4771]: I1001 15:22:37.986080 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:22:37 crc kubenswrapper[4771]: E1001 15:22:37.987330 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:22:49 crc kubenswrapper[4771]: I1001 15:22:48.985087 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:22:49 crc kubenswrapper[4771]: E1001 15:22:48.986155 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:22:54 crc kubenswrapper[4771]: I1001 15:22:54.068554 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-c9qvq"] Oct 01 15:22:54 crc kubenswrapper[4771]: I1001 15:22:54.087274 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7sh6p"] Oct 01 15:22:54 crc kubenswrapper[4771]: I1001 15:22:54.100124 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7sh6p"] Oct 01 15:22:54 crc kubenswrapper[4771]: I1001 15:22:54.111340 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-c9qvq"] Oct 01 15:22:56 crc kubenswrapper[4771]: I1001 15:22:56.002191 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bccbccff-65c9-487b-b3bd-160f41dc53ee" path="/var/lib/kubelet/pods/bccbccff-65c9-487b-b3bd-160f41dc53ee/volumes" Oct 01 15:22:56 crc kubenswrapper[4771]: I1001 15:22:56.003073 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc054581-8c29-4f8e-b1eb-7903c06dfd17" path="/var/lib/kubelet/pods/fc054581-8c29-4f8e-b1eb-7903c06dfd17/volumes" Oct 01 15:22:59 crc kubenswrapper[4771]: I1001 15:22:59.985284 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:22:59 crc kubenswrapper[4771]: E1001 15:22:59.985707 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:23:00 crc kubenswrapper[4771]: I1001 15:23:00.043590 4771 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lrw6m"] Oct 01 15:23:00 crc kubenswrapper[4771]: I1001 15:23:00.058443 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lrw6m"] Oct 01 15:23:01 crc kubenswrapper[4771]: I1001 15:23:01.116406 4771 scope.go:117] "RemoveContainer" containerID="46fdfe9fde4ce0ed6e5020b10715e2684b8498221cf22a391b32800dd7765b89" Oct 01 15:23:01 crc kubenswrapper[4771]: I1001 15:23:01.151961 4771 scope.go:117] "RemoveContainer" containerID="484c257f9380eb59f879f2cbeabad5e5394798cb4e491c7f2d5e94e3c94d1115" Oct 01 15:23:01 crc kubenswrapper[4771]: I1001 15:23:01.998509 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="448dc4ba-7224-4c6b-8448-e50389967c50" path="/var/lib/kubelet/pods/448dc4ba-7224-4c6b-8448-e50389967c50/volumes" Oct 01 15:23:02 crc kubenswrapper[4771]: I1001 15:23:02.033621 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-ddg7l"] Oct 01 15:23:02 crc kubenswrapper[4771]: I1001 15:23:02.042212 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-2vz57"] Oct 01 15:23:02 crc kubenswrapper[4771]: I1001 15:23:02.051218 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-627pv"] Oct 01 15:23:02 crc kubenswrapper[4771]: I1001 15:23:02.058267 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-ddg7l"] Oct 01 15:23:02 crc kubenswrapper[4771]: I1001 15:23:02.065207 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-2vz57"] Oct 01 15:23:02 crc kubenswrapper[4771]: I1001 15:23:02.071955 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-627pv"] Oct 01 15:23:03 crc kubenswrapper[4771]: I1001 15:23:03.999391 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="650906d6-4b76-4503-b0ee-e59f0a3302fb" 
path="/var/lib/kubelet/pods/650906d6-4b76-4503-b0ee-e59f0a3302fb/volumes" Oct 01 15:23:04 crc kubenswrapper[4771]: I1001 15:23:04.000613 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74e7f311-b206-4643-a136-5d40e30e7e39" path="/var/lib/kubelet/pods/74e7f311-b206-4643-a136-5d40e30e7e39/volumes" Oct 01 15:23:04 crc kubenswrapper[4771]: I1001 15:23:04.001446 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6b325a5-74ef-42ee-89a8-c5855a4dc1f8" path="/var/lib/kubelet/pods/b6b325a5-74ef-42ee-89a8-c5855a4dc1f8/volumes" Oct 01 15:23:04 crc kubenswrapper[4771]: I1001 15:23:04.035264 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-330c-account-create-m6s7j"] Oct 01 15:23:04 crc kubenswrapper[4771]: I1001 15:23:04.046281 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-330c-account-create-m6s7j"] Oct 01 15:23:05 crc kubenswrapper[4771]: I1001 15:23:05.041097 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-e413-account-create-wghx5"] Oct 01 15:23:05 crc kubenswrapper[4771]: I1001 15:23:05.052854 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-e413-account-create-wghx5"] Oct 01 15:23:06 crc kubenswrapper[4771]: I1001 15:23:06.005587 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48bdd44b-bf9e-4349-90de-9c3e1126def6" path="/var/lib/kubelet/pods/48bdd44b-bf9e-4349-90de-9c3e1126def6/volumes" Oct 01 15:23:06 crc kubenswrapper[4771]: I1001 15:23:06.006416 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d31bdfbc-f66b-4a6f-b64c-f56b4a63c481" path="/var/lib/kubelet/pods/d31bdfbc-f66b-4a6f-b64c-f56b4a63c481/volumes" Oct 01 15:23:11 crc kubenswrapper[4771]: I1001 15:23:11.986008 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:23:11 crc kubenswrapper[4771]: E1001 15:23:11.986760 
4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:23:14 crc kubenswrapper[4771]: I1001 15:23:14.041178 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-ae86-account-create-jtrlh"] Oct 01 15:23:14 crc kubenswrapper[4771]: I1001 15:23:14.052581 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-ae86-account-create-jtrlh"] Oct 01 15:23:14 crc kubenswrapper[4771]: I1001 15:23:14.065080 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-2939-account-create-9lwzm"] Oct 01 15:23:14 crc kubenswrapper[4771]: I1001 15:23:14.075282 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-bcc4-account-create-bgj79"] Oct 01 15:23:14 crc kubenswrapper[4771]: I1001 15:23:14.084517 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-ec97-account-create-fmwdx"] Oct 01 15:23:14 crc kubenswrapper[4771]: I1001 15:23:14.093495 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-ec97-account-create-fmwdx"] Oct 01 15:23:14 crc kubenswrapper[4771]: I1001 15:23:14.100571 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-2939-account-create-9lwzm"] Oct 01 15:23:14 crc kubenswrapper[4771]: I1001 15:23:14.107394 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-bcc4-account-create-bgj79"] Oct 01 15:23:16 crc kubenswrapper[4771]: I1001 15:23:16.002410 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28639e17-35cd-4824-9390-0a1212a73c73" 
path="/var/lib/kubelet/pods/28639e17-35cd-4824-9390-0a1212a73c73/volumes" Oct 01 15:23:16 crc kubenswrapper[4771]: I1001 15:23:16.003122 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d0dd8a9-f5fe-4838-892d-62ec5db46f3b" path="/var/lib/kubelet/pods/6d0dd8a9-f5fe-4838-892d-62ec5db46f3b/volumes" Oct 01 15:23:16 crc kubenswrapper[4771]: I1001 15:23:16.003779 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86dbf463-9a57-4d6b-b352-b5a6c70d3e9c" path="/var/lib/kubelet/pods/86dbf463-9a57-4d6b-b352-b5a6c70d3e9c/volumes" Oct 01 15:23:16 crc kubenswrapper[4771]: I1001 15:23:16.004481 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd4bd627-838e-4099-9801-c08693e22b9b" path="/var/lib/kubelet/pods/dd4bd627-838e-4099-9801-c08693e22b9b/volumes" Oct 01 15:23:20 crc kubenswrapper[4771]: I1001 15:23:20.030106 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-l6hss"] Oct 01 15:23:20 crc kubenswrapper[4771]: I1001 15:23:20.038584 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-l6hss"] Oct 01 15:23:22 crc kubenswrapper[4771]: I1001 15:23:22.004431 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04988f3e-c2c9-454f-8ec5-7d269d07685a" path="/var/lib/kubelet/pods/04988f3e-c2c9-454f-8ec5-7d269d07685a/volumes" Oct 01 15:23:23 crc kubenswrapper[4771]: I1001 15:23:23.985965 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:23:23 crc kubenswrapper[4771]: E1001 15:23:23.986495 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:23:33 crc kubenswrapper[4771]: I1001 15:23:33.313862 4771 generic.go:334] "Generic (PLEG): container finished" podID="c69dcf56-20fa-4a9a-992c-a73435ff9102" containerID="d7f633632ecb0792128a438430f89b0ee1f937ccaef1f15f6754569881b371d9" exitCode=0 Oct 01 15:23:33 crc kubenswrapper[4771]: I1001 15:23:33.313968 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" event={"ID":"c69dcf56-20fa-4a9a-992c-a73435ff9102","Type":"ContainerDied","Data":"d7f633632ecb0792128a438430f89b0ee1f937ccaef1f15f6754569881b371d9"} Oct 01 15:23:34 crc kubenswrapper[4771]: I1001 15:23:34.813788 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:23:34 crc kubenswrapper[4771]: I1001 15:23:34.996907 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-ssh-key\") pod \"c69dcf56-20fa-4a9a-992c-a73435ff9102\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " Oct 01 15:23:34 crc kubenswrapper[4771]: I1001 15:23:34.997227 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-inventory\") pod \"c69dcf56-20fa-4a9a-992c-a73435ff9102\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " Oct 01 15:23:34 crc kubenswrapper[4771]: I1001 15:23:34.997269 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgt5m\" (UniqueName: \"kubernetes.io/projected/c69dcf56-20fa-4a9a-992c-a73435ff9102-kube-api-access-wgt5m\") pod \"c69dcf56-20fa-4a9a-992c-a73435ff9102\" (UID: \"c69dcf56-20fa-4a9a-992c-a73435ff9102\") " Oct 01 15:23:35 crc 
kubenswrapper[4771]: I1001 15:23:35.009315 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c69dcf56-20fa-4a9a-992c-a73435ff9102-kube-api-access-wgt5m" (OuterVolumeSpecName: "kube-api-access-wgt5m") pod "c69dcf56-20fa-4a9a-992c-a73435ff9102" (UID: "c69dcf56-20fa-4a9a-992c-a73435ff9102"). InnerVolumeSpecName "kube-api-access-wgt5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.053560 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c69dcf56-20fa-4a9a-992c-a73435ff9102" (UID: "c69dcf56-20fa-4a9a-992c-a73435ff9102"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.054254 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-inventory" (OuterVolumeSpecName: "inventory") pod "c69dcf56-20fa-4a9a-992c-a73435ff9102" (UID: "c69dcf56-20fa-4a9a-992c-a73435ff9102"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.099841 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.099882 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c69dcf56-20fa-4a9a-992c-a73435ff9102-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.099897 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgt5m\" (UniqueName: \"kubernetes.io/projected/c69dcf56-20fa-4a9a-992c-a73435ff9102-kube-api-access-wgt5m\") on node \"crc\" DevicePath \"\"" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.337161 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" event={"ID":"c69dcf56-20fa-4a9a-992c-a73435ff9102","Type":"ContainerDied","Data":"fbe65fc1bf79826fe31b28cbb29bb40b5a6025840a881b7f96367df482b33d4d"} Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.337220 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fbe65fc1bf79826fe31b28cbb29bb40b5a6025840a881b7f96367df482b33d4d" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.337245 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.466697 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt"] Oct 01 15:23:35 crc kubenswrapper[4771]: E1001 15:23:35.467265 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c69dcf56-20fa-4a9a-992c-a73435ff9102" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.467292 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="c69dcf56-20fa-4a9a-992c-a73435ff9102" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.467497 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="c69dcf56-20fa-4a9a-992c-a73435ff9102" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.468226 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.470951 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.471423 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.471455 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.472836 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.475285 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt"] Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.611160 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.611380 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.611505 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp66t\" (UniqueName: \"kubernetes.io/projected/5598c0d1-a4ba-4824-8111-dddf70823911-kube-api-access-zp66t\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.714205 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.714343 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp66t\" (UniqueName: \"kubernetes.io/projected/5598c0d1-a4ba-4824-8111-dddf70823911-kube-api-access-zp66t\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.714586 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.721267 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.722158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.736602 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp66t\" (UniqueName: \"kubernetes.io/projected/5598c0d1-a4ba-4824-8111-dddf70823911-kube-api-access-zp66t\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:35 crc kubenswrapper[4771]: I1001 15:23:35.806947 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:23:36 crc kubenswrapper[4771]: I1001 15:23:36.361470 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt"] Oct 01 15:23:36 crc kubenswrapper[4771]: I1001 15:23:36.370157 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:23:37 crc kubenswrapper[4771]: I1001 15:23:37.364681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" event={"ID":"5598c0d1-a4ba-4824-8111-dddf70823911","Type":"ContainerStarted","Data":"a62e9668a63bd4e93d5a92433c68d488c0461d81dcd03170b3c1a227ec9f4700"} Oct 01 15:23:37 crc kubenswrapper[4771]: I1001 15:23:37.367984 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" event={"ID":"5598c0d1-a4ba-4824-8111-dddf70823911","Type":"ContainerStarted","Data":"0a484fbaafa6afde2f8c7461a99dfcf3de4e18725c4ed885505643606466b76c"} Oct 01 15:23:37 crc kubenswrapper[4771]: I1001 15:23:37.401334 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" podStartSLOduration=1.862726881 podStartE2EDuration="2.401307843s" podCreationTimestamp="2025-10-01 15:23:35 +0000 UTC" firstStartedPulling="2025-10-01 15:23:36.369729081 +0000 UTC m=+1660.988904302" lastFinishedPulling="2025-10-01 15:23:36.908310053 +0000 UTC m=+1661.527485264" observedRunningTime="2025-10-01 15:23:37.39126553 +0000 UTC m=+1662.010440731" watchObservedRunningTime="2025-10-01 15:23:37.401307843 +0000 UTC m=+1662.020483024" Oct 01 15:23:38 crc kubenswrapper[4771]: I1001 15:23:38.985331 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 
15:23:38 crc kubenswrapper[4771]: E1001 15:23:38.985860 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:23:50 crc kubenswrapper[4771]: I1001 15:23:50.985926 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:23:50 crc kubenswrapper[4771]: E1001 15:23:50.986806 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.286358 4771 scope.go:117] "RemoveContainer" containerID="6db9647877c86fa0e538911431e4e7b925059f199c33a36783f09ee88e010bca" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.316378 4771 scope.go:117] "RemoveContainer" containerID="cf136dc245827fbe5b3c4d31bb810a8befe8438a3a2fe35e4ca4f5c2614e03b8" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.353905 4771 scope.go:117] "RemoveContainer" containerID="62eeaba437500f0b27f8a9b401ccf4e5b965dad0ed30093065e403e91fef3b5b" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.417239 4771 scope.go:117] "RemoveContainer" containerID="5e844da9189ece3df4085c7933f71cfbfda01ddbda537ed4082972000854297a" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.458405 4771 scope.go:117] "RemoveContainer" 
containerID="f8ab8f67ce768baea8ebbdebb514dd523e38ac6f89cde17f00bf3b1087231269" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.503956 4771 scope.go:117] "RemoveContainer" containerID="afb716d4146e55322085752768bfaf7dea05659a898bfd7a35e8f33432e12903" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.538928 4771 scope.go:117] "RemoveContainer" containerID="36e14aecc4196441acb3b318260ee774c457f14a319476e9aa52221e7d5a388f" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.587693 4771 scope.go:117] "RemoveContainer" containerID="ddef70121a06df011cd9c0e8719f5dd03127b6c67d02bf94dafe6a280be85d6b" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.617082 4771 scope.go:117] "RemoveContainer" containerID="9eed105f8bdc1d5e2c48424c5dbef253c3805ae6574dce161114c88b6a63e677" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.642152 4771 scope.go:117] "RemoveContainer" containerID="877114b5ea07b63ad882069286ac22c0e0cb10181c19aa21af9f4b38a9f2d632" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.662000 4771 scope.go:117] "RemoveContainer" containerID="d1f4e0400758a0a5710359f819b00860985ebbef25d26a61d9cc05553e6efc80" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.695558 4771 scope.go:117] "RemoveContainer" containerID="56eef2c37071396c9650602f25de34e4b5dc4e39b37639652b357006f0c0f99c" Oct 01 15:24:01 crc kubenswrapper[4771]: I1001 15:24:01.728671 4771 scope.go:117] "RemoveContainer" containerID="e4374df0d4ede53e93aafc7d6199d0453d62027813d852ca7553887badb4500c" Oct 01 15:24:04 crc kubenswrapper[4771]: I1001 15:24:04.054818 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-r9ffd"] Oct 01 15:24:04 crc kubenswrapper[4771]: I1001 15:24:04.066970 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-r9ffd"] Oct 01 15:24:05 crc kubenswrapper[4771]: I1001 15:24:05.994400 4771 scope.go:117] "RemoveContainer" 
containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:24:05 crc kubenswrapper[4771]: E1001 15:24:05.995432 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:24:05 crc kubenswrapper[4771]: I1001 15:24:05.998823 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10ee2ec-1ec2-4353-ad8b-74ac0e031289" path="/var/lib/kubelet/pods/a10ee2ec-1ec2-4353-ad8b-74ac0e031289/volumes" Oct 01 15:24:09 crc kubenswrapper[4771]: I1001 15:24:09.031666 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-qfvj2"] Oct 01 15:24:09 crc kubenswrapper[4771]: I1001 15:24:09.042462 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-qfvj2"] Oct 01 15:24:10 crc kubenswrapper[4771]: I1001 15:24:10.005263 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c" path="/var/lib/kubelet/pods/7709b00c-a5a2-46c9-a4dc-ffa7fe3c912c/volumes" Oct 01 15:24:18 crc kubenswrapper[4771]: I1001 15:24:18.985287 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:24:18 crc kubenswrapper[4771]: E1001 15:24:18.986121 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:24:20 crc kubenswrapper[4771]: I1001 15:24:20.039573 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-d4zpd"] Oct 01 15:24:20 crc kubenswrapper[4771]: I1001 15:24:20.057941 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-d4zpd"] Oct 01 15:24:21 crc kubenswrapper[4771]: I1001 15:24:21.996476 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91950d77-9457-412f-be07-626b553f6b8d" path="/var/lib/kubelet/pods/91950d77-9457-412f-be07-626b553f6b8d/volumes" Oct 01 15:24:24 crc kubenswrapper[4771]: I1001 15:24:24.055316 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-nvszz"] Oct 01 15:24:24 crc kubenswrapper[4771]: I1001 15:24:24.068233 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-nvszz"] Oct 01 15:24:25 crc kubenswrapper[4771]: I1001 15:24:25.999253 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313f9ab1-8fa8-476f-94cb-1d94bd975a06" path="/var/lib/kubelet/pods/313f9ab1-8fa8-476f-94cb-1d94bd975a06/volumes" Oct 01 15:24:28 crc kubenswrapper[4771]: I1001 15:24:28.030519 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-99rss"] Oct 01 15:24:28 crc kubenswrapper[4771]: I1001 15:24:28.038876 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-99rss"] Oct 01 15:24:29 crc kubenswrapper[4771]: I1001 15:24:29.985876 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:24:29 crc kubenswrapper[4771]: E1001 15:24:29.986677 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:24:29 crc kubenswrapper[4771]: I1001 15:24:29.997922 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a523e37-804a-4173-8012-19848efc8cc0" path="/var/lib/kubelet/pods/9a523e37-804a-4173-8012-19848efc8cc0/volumes" Oct 01 15:24:31 crc kubenswrapper[4771]: I1001 15:24:31.036115 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gq7gl"] Oct 01 15:24:31 crc kubenswrapper[4771]: I1001 15:24:31.049450 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gq7gl"] Oct 01 15:24:32 crc kubenswrapper[4771]: I1001 15:24:32.022343 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b7689a2-6ac8-47ac-86f7-7456994c39ca" path="/var/lib/kubelet/pods/7b7689a2-6ac8-47ac-86f7-7456994c39ca/volumes" Oct 01 15:24:43 crc kubenswrapper[4771]: I1001 15:24:43.985995 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:24:43 crc kubenswrapper[4771]: E1001 15:24:43.987102 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:24:47 crc kubenswrapper[4771]: I1001 15:24:47.141409 4771 generic.go:334] "Generic (PLEG): container finished" podID="5598c0d1-a4ba-4824-8111-dddf70823911" containerID="a62e9668a63bd4e93d5a92433c68d488c0461d81dcd03170b3c1a227ec9f4700" exitCode=0 Oct 01 
15:24:47 crc kubenswrapper[4771]: I1001 15:24:47.143112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" event={"ID":"5598c0d1-a4ba-4824-8111-dddf70823911","Type":"ContainerDied","Data":"a62e9668a63bd4e93d5a92433c68d488c0461d81dcd03170b3c1a227ec9f4700"} Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.654874 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.844933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-ssh-key\") pod \"5598c0d1-a4ba-4824-8111-dddf70823911\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.845132 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp66t\" (UniqueName: \"kubernetes.io/projected/5598c0d1-a4ba-4824-8111-dddf70823911-kube-api-access-zp66t\") pod \"5598c0d1-a4ba-4824-8111-dddf70823911\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.845166 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-inventory\") pod \"5598c0d1-a4ba-4824-8111-dddf70823911\" (UID: \"5598c0d1-a4ba-4824-8111-dddf70823911\") " Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.854374 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5598c0d1-a4ba-4824-8111-dddf70823911-kube-api-access-zp66t" (OuterVolumeSpecName: "kube-api-access-zp66t") pod "5598c0d1-a4ba-4824-8111-dddf70823911" (UID: "5598c0d1-a4ba-4824-8111-dddf70823911"). 
InnerVolumeSpecName "kube-api-access-zp66t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.877806 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-inventory" (OuterVolumeSpecName: "inventory") pod "5598c0d1-a4ba-4824-8111-dddf70823911" (UID: "5598c0d1-a4ba-4824-8111-dddf70823911"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.898786 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "5598c0d1-a4ba-4824-8111-dddf70823911" (UID: "5598c0d1-a4ba-4824-8111-dddf70823911"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.947423 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.947450 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp66t\" (UniqueName: \"kubernetes.io/projected/5598c0d1-a4ba-4824-8111-dddf70823911-kube-api-access-zp66t\") on node \"crc\" DevicePath \"\"" Oct 01 15:24:48 crc kubenswrapper[4771]: I1001 15:24:48.948258 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5598c0d1-a4ba-4824-8111-dddf70823911-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.169486 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt" 
event={"ID":"5598c0d1-a4ba-4824-8111-dddf70823911","Type":"ContainerDied","Data":"0a484fbaafa6afde2f8c7461a99dfcf3de4e18725c4ed885505643606466b76c"}
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.170103 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a484fbaafa6afde2f8c7461a99dfcf3de4e18725c4ed885505643606466b76c"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.169876 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.268302 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"]
Oct 01 15:24:49 crc kubenswrapper[4771]: E1001 15:24:49.269157 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5598c0d1-a4ba-4824-8111-dddf70823911" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.269284 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5598c0d1-a4ba-4824-8111-dddf70823911" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.269680 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5598c0d1-a4ba-4824-8111-dddf70823911" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.270833 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.275306 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.275410 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.275880 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.276199 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.297981 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"]
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.460952 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.461521 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md8kq\" (UniqueName: \"kubernetes.io/projected/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-kube-api-access-md8kq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.461798 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.563469 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.563906 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.564442 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md8kq\" (UniqueName: \"kubernetes.io/projected/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-kube-api-access-md8kq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.574974 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.574976 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.602226 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md8kq\" (UniqueName: \"kubernetes.io/projected/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-kube-api-access-md8kq\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:49 crc kubenswrapper[4771]: I1001 15:24:49.902683 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:50 crc kubenswrapper[4771]: I1001 15:24:50.508619 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"]
Oct 01 15:24:51 crc kubenswrapper[4771]: I1001 15:24:51.193591 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx" event={"ID":"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5","Type":"ContainerStarted","Data":"993db22fe916ec6b39b8e8dfcbffc7984ed12711e5255f6c54dccf9f15b99558"}
Oct 01 15:24:52 crc kubenswrapper[4771]: I1001 15:24:52.209616 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx" event={"ID":"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5","Type":"ContainerStarted","Data":"efc72eb8c640888c24d4b2cfd21b7b90bd57a150036503692e06f0a7522414c3"}
Oct 01 15:24:52 crc kubenswrapper[4771]: I1001 15:24:52.232349 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx" podStartSLOduration=2.60175184 podStartE2EDuration="3.232327156s" podCreationTimestamp="2025-10-01 15:24:49 +0000 UTC" firstStartedPulling="2025-10-01 15:24:50.526918472 +0000 UTC m=+1735.146093663" lastFinishedPulling="2025-10-01 15:24:51.157493798 +0000 UTC m=+1735.776668979" observedRunningTime="2025-10-01 15:24:52.228300218 +0000 UTC m=+1736.847475399" watchObservedRunningTime="2025-10-01 15:24:52.232327156 +0000 UTC m=+1736.851502357"
Oct 01 15:24:53 crc kubenswrapper[4771]: I1001 15:24:53.048432 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-bszpn"]
Oct 01 15:24:53 crc kubenswrapper[4771]: I1001 15:24:53.059892 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-6sk5b"]
Oct 01 15:24:53 crc kubenswrapper[4771]: I1001 15:24:53.068700 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-bshc5"]
Oct 01 15:24:53 crc kubenswrapper[4771]: I1001 15:24:53.075343 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-bszpn"]
Oct 01 15:24:53 crc kubenswrapper[4771]: I1001 15:24:53.081206 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-6sk5b"]
Oct 01 15:24:53 crc kubenswrapper[4771]: I1001 15:24:53.087836 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bshc5"]
Oct 01 15:24:54 crc kubenswrapper[4771]: I1001 15:24:54.006603 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afe16e8-e581-4752-9128-0516838132ae" path="/var/lib/kubelet/pods/3afe16e8-e581-4752-9128-0516838132ae/volumes"
Oct 01 15:24:54 crc kubenswrapper[4771]: I1001 15:24:54.007975 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c4d2578-710e-45af-86c8-8c8677ecc0b6" path="/var/lib/kubelet/pods/8c4d2578-710e-45af-86c8-8c8677ecc0b6/volumes"
Oct 01 15:24:54 crc kubenswrapper[4771]: I1001 15:24:54.009185 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6003332-ade7-416e-8165-0b3768b94dc0" path="/var/lib/kubelet/pods/f6003332-ade7-416e-8165-0b3768b94dc0/volumes"
Oct 01 15:24:57 crc kubenswrapper[4771]: I1001 15:24:57.271251 4771 generic.go:334] "Generic (PLEG): container finished" podID="9fb926c9-80f3-4d82-9de5-a4f0fc314ef5" containerID="efc72eb8c640888c24d4b2cfd21b7b90bd57a150036503692e06f0a7522414c3" exitCode=0
Oct 01 15:24:57 crc kubenswrapper[4771]: I1001 15:24:57.271423 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx" event={"ID":"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5","Type":"ContainerDied","Data":"efc72eb8c640888c24d4b2cfd21b7b90bd57a150036503692e06f0a7522414c3"}
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.077707 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-1f68-account-create-d94sr"]
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.085778 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-1f68-account-create-d94sr"]
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.739066 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.860874 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-ssh-key\") pod \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") "
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.860947 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-inventory\") pod \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") "
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.861142 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md8kq\" (UniqueName: \"kubernetes.io/projected/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-kube-api-access-md8kq\") pod \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\" (UID: \"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5\") "
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.866110 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-kube-api-access-md8kq" (OuterVolumeSpecName: "kube-api-access-md8kq") pod "9fb926c9-80f3-4d82-9de5-a4f0fc314ef5" (UID: "9fb926c9-80f3-4d82-9de5-a4f0fc314ef5"). InnerVolumeSpecName "kube-api-access-md8kq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.892190 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9fb926c9-80f3-4d82-9de5-a4f0fc314ef5" (UID: "9fb926c9-80f3-4d82-9de5-a4f0fc314ef5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.894675 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-inventory" (OuterVolumeSpecName: "inventory") pod "9fb926c9-80f3-4d82-9de5-a4f0fc314ef5" (UID: "9fb926c9-80f3-4d82-9de5-a4f0fc314ef5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.963760 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.963797 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.963813 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md8kq\" (UniqueName: \"kubernetes.io/projected/9fb926c9-80f3-4d82-9de5-a4f0fc314ef5-kube-api-access-md8kq\") on node \"crc\" DevicePath \"\""
Oct 01 15:24:58 crc kubenswrapper[4771]: I1001 15:24:58.985753 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2"
Oct 01 15:24:58 crc kubenswrapper[4771]: E1001 15:24:58.986142 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.291221 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx" event={"ID":"9fb926c9-80f3-4d82-9de5-a4f0fc314ef5","Type":"ContainerDied","Data":"993db22fe916ec6b39b8e8dfcbffc7984ed12711e5255f6c54dccf9f15b99558"}
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.291266 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.291278 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="993db22fe916ec6b39b8e8dfcbffc7984ed12711e5255f6c54dccf9f15b99558"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.374864 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"]
Oct 01 15:24:59 crc kubenswrapper[4771]: E1001 15:24:59.375380 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb926c9-80f3-4d82-9de5-a4f0fc314ef5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.375402 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb926c9-80f3-4d82-9de5-a4f0fc314ef5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.375629 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb926c9-80f3-4d82-9de5-a4f0fc314ef5" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.376443 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.379159 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.379213 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.379941 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.382564 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.382813 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"]
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.474427 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.474933 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.475076 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvksh\" (UniqueName: \"kubernetes.io/projected/333518f5-86a1-4afc-974d-c3dbee185c42-kube-api-access-vvksh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.576842 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.576979 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvksh\" (UniqueName: \"kubernetes.io/projected/333518f5-86a1-4afc-974d-c3dbee185c42-kube-api-access-vvksh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.577189 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.582099 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.590331 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.601120 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvksh\" (UniqueName: \"kubernetes.io/projected/333518f5-86a1-4afc-974d-c3dbee185c42-kube-api-access-vvksh\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zkh9n\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:24:59 crc kubenswrapper[4771]: I1001 15:24:59.699479 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:25:00 crc kubenswrapper[4771]: I1001 15:25:00.000574 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eef84f4-0cf2-4a95-964d-f37667da25da" path="/var/lib/kubelet/pods/6eef84f4-0cf2-4a95-964d-f37667da25da/volumes"
Oct 01 15:25:00 crc kubenswrapper[4771]: I1001 15:25:00.183323 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"]
Oct 01 15:25:00 crc kubenswrapper[4771]: I1001 15:25:00.301363 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n" event={"ID":"333518f5-86a1-4afc-974d-c3dbee185c42","Type":"ContainerStarted","Data":"76a3768fbeac397eb7e36ec20eaa2958d5ab737def7abf7a495d22c80bc33fb3"}
Oct 01 15:25:01 crc kubenswrapper[4771]: I1001 15:25:01.312196 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n" event={"ID":"333518f5-86a1-4afc-974d-c3dbee185c42","Type":"ContainerStarted","Data":"4feb9ed4526dc28fa8c656f9a8ad6c3e1bf4a2ba38e6598c8685d9f4de8ef2f8"}
Oct 01 15:25:01 crc kubenswrapper[4771]: I1001 15:25:01.341276 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n" podStartSLOduration=1.837079871 podStartE2EDuration="2.341256061s" podCreationTimestamp="2025-10-01 15:24:59 +0000 UTC" firstStartedPulling="2025-10-01 15:25:00.187958226 +0000 UTC m=+1744.807133417" lastFinishedPulling="2025-10-01 15:25:00.692134406 +0000 UTC m=+1745.311309607" observedRunningTime="2025-10-01 15:25:01.333928334 +0000 UTC m=+1745.953103525" watchObservedRunningTime="2025-10-01 15:25:01.341256061 +0000 UTC m=+1745.960431232"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.086663 4771 scope.go:117] "RemoveContainer" containerID="70a1fd9d2c3a6eaff72432f6b177e5779069a3032e30900ecc29c59f70de7c24"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.114336 4771 scope.go:117] "RemoveContainer" containerID="758ccba9aed842f21dfca918e8574ac6b220bd4a6467406d8a7aafee26ba08a9"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.209693 4771 scope.go:117] "RemoveContainer" containerID="9f82b8ba173061e613facbd6e0767a3bb0a6f57a8ebd68abe3447c956660cf05"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.243097 4771 scope.go:117] "RemoveContainer" containerID="24164528286d94c164609d173dbf72e452c655f857e1f9c2834ee0be39e4d572"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.293120 4771 scope.go:117] "RemoveContainer" containerID="4fc72c6cf9e93feafe7bc58a3482e603c61ba143774546119094c46592de66bf"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.321675 4771 scope.go:117] "RemoveContainer" containerID="d140b7ce466b3823c65fe198c4f62d5d313c28d82b7acdeee65dd69e6f7610ac"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.390699 4771 scope.go:117] "RemoveContainer" containerID="c00acf985cfe64678616d36ec81d0a930db7d80fdf370d7eb6a3a5fa5167bcb0"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.424280 4771 scope.go:117] "RemoveContainer" containerID="5f20731ceb27dc77af04cf960e63721a47976e8675a07ee752f038d50d44be44"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.471067 4771 scope.go:117] "RemoveContainer" containerID="d3afd8daf31a54889ea06e80fb2e696bebb9f2f782ba5c2d3ff182daf3212ac4"
Oct 01 15:25:02 crc kubenswrapper[4771]: I1001 15:25:02.514925 4771 scope.go:117] "RemoveContainer" containerID="d038276b3c071804af5139b55833ae2ef7f92b284760ce854c9d49c53f53ea19"
Oct 01 15:25:08 crc kubenswrapper[4771]: I1001 15:25:08.039115 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7184-account-create-s58vj"]
Oct 01 15:25:08 crc kubenswrapper[4771]: I1001 15:25:08.050490 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-3995-account-create-bqj22"]
Oct 01 15:25:08 crc kubenswrapper[4771]: I1001 15:25:08.059646 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7184-account-create-s58vj"]
Oct 01 15:25:08 crc kubenswrapper[4771]: I1001 15:25:08.068979 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-3995-account-create-bqj22"]
Oct 01 15:25:10 crc kubenswrapper[4771]: I1001 15:25:10.006315 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7e33d0-9533-4b7f-9bf3-d5b55185f04e" path="/var/lib/kubelet/pods/2a7e33d0-9533-4b7f-9bf3-d5b55185f04e/volumes"
Oct 01 15:25:10 crc kubenswrapper[4771]: I1001 15:25:10.008023 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aa95246-7e3e-49cc-90df-6d96afa66bdb" path="/var/lib/kubelet/pods/9aa95246-7e3e-49cc-90df-6d96afa66bdb/volumes"
Oct 01 15:25:11 crc kubenswrapper[4771]: I1001 15:25:11.985943 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2"
Oct 01 15:25:11 crc kubenswrapper[4771]: E1001 15:25:11.986487 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 15:25:25 crc kubenswrapper[4771]: I1001 15:25:25.992198 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2"
Oct 01 15:25:25 crc kubenswrapper[4771]: E1001 15:25:25.993158 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 15:25:31 crc kubenswrapper[4771]: I1001 15:25:31.041300 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xfmf"]
Oct 01 15:25:31 crc kubenswrapper[4771]: I1001 15:25:31.048608 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xfmf"]
Oct 01 15:25:32 crc kubenswrapper[4771]: I1001 15:25:32.000353 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f37934-8f6e-4013-b7a4-5563e5245c79" path="/var/lib/kubelet/pods/95f37934-8f6e-4013-b7a4-5563e5245c79/volumes"
Oct 01 15:25:39 crc kubenswrapper[4771]: I1001 15:25:39.985086 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2"
Oct 01 15:25:39 crc kubenswrapper[4771]: E1001 15:25:39.986152 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 15:25:41 crc kubenswrapper[4771]: I1001 15:25:41.777925 4771 generic.go:334] "Generic (PLEG): container finished" podID="333518f5-86a1-4afc-974d-c3dbee185c42" containerID="4feb9ed4526dc28fa8c656f9a8ad6c3e1bf4a2ba38e6598c8685d9f4de8ef2f8" exitCode=0
Oct 01 15:25:41 crc kubenswrapper[4771]: I1001 15:25:41.777983 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n" event={"ID":"333518f5-86a1-4afc-974d-c3dbee185c42","Type":"ContainerDied","Data":"4feb9ed4526dc28fa8c656f9a8ad6c3e1bf4a2ba38e6598c8685d9f4de8ef2f8"}
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.258242 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.390629 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-ssh-key\") pod \"333518f5-86a1-4afc-974d-c3dbee185c42\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") "
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.390779 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-inventory\") pod \"333518f5-86a1-4afc-974d-c3dbee185c42\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") "
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.390841 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvksh\" (UniqueName: \"kubernetes.io/projected/333518f5-86a1-4afc-974d-c3dbee185c42-kube-api-access-vvksh\") pod \"333518f5-86a1-4afc-974d-c3dbee185c42\" (UID: \"333518f5-86a1-4afc-974d-c3dbee185c42\") "
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.396535 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/333518f5-86a1-4afc-974d-c3dbee185c42-kube-api-access-vvksh" (OuterVolumeSpecName: "kube-api-access-vvksh") pod "333518f5-86a1-4afc-974d-c3dbee185c42" (UID: "333518f5-86a1-4afc-974d-c3dbee185c42"). InnerVolumeSpecName "kube-api-access-vvksh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.418547 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "333518f5-86a1-4afc-974d-c3dbee185c42" (UID: "333518f5-86a1-4afc-974d-c3dbee185c42"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.422650 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-inventory" (OuterVolumeSpecName: "inventory") pod "333518f5-86a1-4afc-974d-c3dbee185c42" (UID: "333518f5-86a1-4afc-974d-c3dbee185c42"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.492903 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.492945 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/333518f5-86a1-4afc-974d-c3dbee185c42-inventory\") on node \"crc\" DevicePath \"\""
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.492959 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvksh\" (UniqueName: \"kubernetes.io/projected/333518f5-86a1-4afc-974d-c3dbee185c42-kube-api-access-vvksh\") on node \"crc\" DevicePath \"\""
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.813077 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n" event={"ID":"333518f5-86a1-4afc-974d-c3dbee185c42","Type":"ContainerDied","Data":"76a3768fbeac397eb7e36ec20eaa2958d5ab737def7abf7a495d22c80bc33fb3"}
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.813115 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76a3768fbeac397eb7e36ec20eaa2958d5ab737def7abf7a495d22c80bc33fb3"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.813236 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zkh9n"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.917296 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"]
Oct 01 15:25:43 crc kubenswrapper[4771]: E1001 15:25:43.918088 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="333518f5-86a1-4afc-974d-c3dbee185c42" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.918231 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="333518f5-86a1-4afc-974d-c3dbee185c42" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.918794 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="333518f5-86a1-4afc-974d-c3dbee185c42" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.919698 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.922932 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.923162 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.924840 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.925915 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 01 15:25:43 crc kubenswrapper[4771]: I1001 15:25:43.926745 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"]
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.002869 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.003167 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4sgl\" (UniqueName: \"kubernetes.io/projected/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-kube-api-access-j4sgl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.003429 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.104893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.104988 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.105048 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4sgl\" (UniqueName: \"kubernetes.io/projected/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-kube-api-access-j4sgl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.108360 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.111114 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.121661 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4sgl\" (UniqueName: \"kubernetes.io/projected/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-kube-api-access-j4sgl\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-txt9q\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"
Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.241911 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q" Oct 01 15:25:44 crc kubenswrapper[4771]: I1001 15:25:44.832659 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q"] Oct 01 15:25:45 crc kubenswrapper[4771]: I1001 15:25:45.833297 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q" event={"ID":"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8","Type":"ContainerStarted","Data":"7b9332ed1b6054c01af3198a4e9c0d7b046d18d4672ed895265c5b14504823b1"} Oct 01 15:25:46 crc kubenswrapper[4771]: I1001 15:25:46.844330 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q" event={"ID":"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8","Type":"ContainerStarted","Data":"62c52839a5c1ffaa5164f35e679f63415f04ffb05048c2b83be6e0ccbc9c46ab"} Oct 01 15:25:46 crc kubenswrapper[4771]: I1001 15:25:46.871020 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q" podStartSLOduration=3.038120971 podStartE2EDuration="3.870999551s" podCreationTimestamp="2025-10-01 15:25:43 +0000 UTC" firstStartedPulling="2025-10-01 15:25:44.84306594 +0000 UTC m=+1789.462241121" lastFinishedPulling="2025-10-01 15:25:45.67594453 +0000 UTC m=+1790.295119701" observedRunningTime="2025-10-01 15:25:46.867020993 +0000 UTC m=+1791.486196204" watchObservedRunningTime="2025-10-01 15:25:46.870999551 +0000 UTC m=+1791.490174732" Oct 01 15:25:54 crc kubenswrapper[4771]: I1001 15:25:54.060024 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-hqpb2"] Oct 01 15:25:54 crc kubenswrapper[4771]: I1001 15:25:54.072697 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-hqpb2"] Oct 01 15:25:54 crc kubenswrapper[4771]: I1001 
15:25:54.985424 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:25:54 crc kubenswrapper[4771]: E1001 15:25:54.985971 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:25:55 crc kubenswrapper[4771]: I1001 15:25:55.040148 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fb68r"] Oct 01 15:25:55 crc kubenswrapper[4771]: I1001 15:25:55.049811 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fb68r"] Oct 01 15:25:55 crc kubenswrapper[4771]: I1001 15:25:55.999173 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e141b7-bc02-4fbe-b918-6b31a4dea6cf" path="/var/lib/kubelet/pods/97e141b7-bc02-4fbe-b918-6b31a4dea6cf/volumes" Oct 01 15:25:56 crc kubenswrapper[4771]: I1001 15:25:56.000085 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abc660b6-f8c7-4a39-b5f1-861ecaa73e75" path="/var/lib/kubelet/pods/abc660b6-f8c7-4a39-b5f1-861ecaa73e75/volumes" Oct 01 15:26:02 crc kubenswrapper[4771]: I1001 15:26:02.798172 4771 scope.go:117] "RemoveContainer" containerID="1a5f5f9a60df4d9edeecf2be5ab3c48d1643e112880c8b38f1a80d0a3738b41d" Oct 01 15:26:02 crc kubenswrapper[4771]: I1001 15:26:02.835582 4771 scope.go:117] "RemoveContainer" containerID="97995242163b40a7a899fdc97b76b359f163a71d228c41ee82476bdbe4f7f219" Oct 01 15:26:02 crc kubenswrapper[4771]: I1001 15:26:02.908652 4771 scope.go:117] "RemoveContainer" 
containerID="2bda283390b2179b92ccb2c1cdb4baeb170ec2009c6bf7a5860510b7057457d6" Oct 01 15:26:02 crc kubenswrapper[4771]: I1001 15:26:02.968548 4771 scope.go:117] "RemoveContainer" containerID="7ac0a44bd285907f9bc19f2bc2ec9d571c6507531c623cdf49edb9ca1b72fc57" Oct 01 15:26:03 crc kubenswrapper[4771]: I1001 15:26:03.008319 4771 scope.go:117] "RemoveContainer" containerID="a6e3a0e105d23e6398c3b0c2daff658ed59e266b5c5977dfce20cf632cb6d692" Oct 01 15:26:09 crc kubenswrapper[4771]: I1001 15:26:09.985416 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:26:09 crc kubenswrapper[4771]: E1001 15:26:09.986460 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:26:24 crc kubenswrapper[4771]: I1001 15:26:24.985662 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:26:26 crc kubenswrapper[4771]: I1001 15:26:26.286455 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"9a6a9c0077062d17e78226f49233ded0b62ef8f37ee11aa7036368c11d9a3cf2"} Oct 01 15:26:39 crc kubenswrapper[4771]: I1001 15:26:39.066549 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-w58g7"] Oct 01 15:26:39 crc kubenswrapper[4771]: I1001 15:26:39.080322 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-w58g7"] Oct 01 15:26:40 crc kubenswrapper[4771]: 
I1001 15:26:40.005191 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609dfd55-d3d8-4ae3-b8ce-9b64c07ac798" path="/var/lib/kubelet/pods/609dfd55-d3d8-4ae3-b8ce-9b64c07ac798/volumes" Oct 01 15:26:42 crc kubenswrapper[4771]: I1001 15:26:42.456391 4771 generic.go:334] "Generic (PLEG): container finished" podID="a64a2e26-92a1-4578-9a5a-fc5e8062f1b8" containerID="62c52839a5c1ffaa5164f35e679f63415f04ffb05048c2b83be6e0ccbc9c46ab" exitCode=2 Oct 01 15:26:42 crc kubenswrapper[4771]: I1001 15:26:42.456500 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q" event={"ID":"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8","Type":"ContainerDied","Data":"62c52839a5c1ffaa5164f35e679f63415f04ffb05048c2b83be6e0ccbc9c46ab"} Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.073331 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q" Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.196302 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-ssh-key\") pod \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.196370 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4sgl\" (UniqueName: \"kubernetes.io/projected/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-kube-api-access-j4sgl\") pod \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.196441 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-inventory\") pod 
\"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\" (UID: \"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8\") " Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.202455 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-kube-api-access-j4sgl" (OuterVolumeSpecName: "kube-api-access-j4sgl") pod "a64a2e26-92a1-4578-9a5a-fc5e8062f1b8" (UID: "a64a2e26-92a1-4578-9a5a-fc5e8062f1b8"). InnerVolumeSpecName "kube-api-access-j4sgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.249005 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a64a2e26-92a1-4578-9a5a-fc5e8062f1b8" (UID: "a64a2e26-92a1-4578-9a5a-fc5e8062f1b8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.250744 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-inventory" (OuterVolumeSpecName: "inventory") pod "a64a2e26-92a1-4578-9a5a-fc5e8062f1b8" (UID: "a64a2e26-92a1-4578-9a5a-fc5e8062f1b8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.299239 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.299292 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4sgl\" (UniqueName: \"kubernetes.io/projected/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-kube-api-access-j4sgl\") on node \"crc\" DevicePath \"\"" Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.299351 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a64a2e26-92a1-4578-9a5a-fc5e8062f1b8-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.487726 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q" event={"ID":"a64a2e26-92a1-4578-9a5a-fc5e8062f1b8","Type":"ContainerDied","Data":"7b9332ed1b6054c01af3198a4e9c0d7b046d18d4672ed895265c5b14504823b1"} Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.487791 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b9332ed1b6054c01af3198a4e9c0d7b046d18d4672ed895265c5b14504823b1" Oct 01 15:26:44 crc kubenswrapper[4771]: I1001 15:26:44.487801 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-txt9q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.037824 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q"] Oct 01 15:26:51 crc kubenswrapper[4771]: E1001 15:26:51.038798 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a64a2e26-92a1-4578-9a5a-fc5e8062f1b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.038819 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="a64a2e26-92a1-4578-9a5a-fc5e8062f1b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.039107 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="a64a2e26-92a1-4578-9a5a-fc5e8062f1b8" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.050159 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q"] Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.050289 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.054010 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.054037 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.054160 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.057173 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.135900 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqj5c\" (UniqueName: \"kubernetes.io/projected/ae5eb9bd-1612-4698-850a-21e0b335a920-kube-api-access-dqj5c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.136056 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.136124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-inventory\") pod 
\"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.238263 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.238633 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.238850 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqj5c\" (UniqueName: \"kubernetes.io/projected/ae5eb9bd-1612-4698-850a-21e0b335a920-kube-api-access-dqj5c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.245037 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.245112 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.261075 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqj5c\" (UniqueName: \"kubernetes.io/projected/ae5eb9bd-1612-4698-850a-21e0b335a920-kube-api-access-dqj5c\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-vv89q\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.381680 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:26:51 crc kubenswrapper[4771]: I1001 15:26:51.937232 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q"] Oct 01 15:26:52 crc kubenswrapper[4771]: I1001 15:26:52.568587 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" event={"ID":"ae5eb9bd-1612-4698-850a-21e0b335a920","Type":"ContainerStarted","Data":"0dbd990d380a2e4cd67ba81593a41fd9a0e4b3314aa83053e142733e830e06c6"} Oct 01 15:26:53 crc kubenswrapper[4771]: I1001 15:26:53.580348 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" event={"ID":"ae5eb9bd-1612-4698-850a-21e0b335a920","Type":"ContainerStarted","Data":"1538b09e916e6ac3cadc0a8ca8995752a75e3591ed66c4de4a68b990d3882111"} Oct 01 15:26:53 crc kubenswrapper[4771]: I1001 15:26:53.611613 4771 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" podStartSLOduration=2.12789581 podStartE2EDuration="2.611591939s" podCreationTimestamp="2025-10-01 15:26:51 +0000 UTC" firstStartedPulling="2025-10-01 15:26:51.940523908 +0000 UTC m=+1856.559699069" lastFinishedPulling="2025-10-01 15:26:52.424219997 +0000 UTC m=+1857.043395198" observedRunningTime="2025-10-01 15:26:53.605127201 +0000 UTC m=+1858.224302372" watchObservedRunningTime="2025-10-01 15:26:53.611591939 +0000 UTC m=+1858.230767110" Oct 01 15:27:03 crc kubenswrapper[4771]: I1001 15:27:03.194411 4771 scope.go:117] "RemoveContainer" containerID="7db5e2a842876c469532a0a62c92604ad4c42a6288c773c3789a76d4565d4977" Oct 01 15:27:43 crc kubenswrapper[4771]: I1001 15:27:43.075547 4771 generic.go:334] "Generic (PLEG): container finished" podID="ae5eb9bd-1612-4698-850a-21e0b335a920" containerID="1538b09e916e6ac3cadc0a8ca8995752a75e3591ed66c4de4a68b990d3882111" exitCode=0 Oct 01 15:27:43 crc kubenswrapper[4771]: I1001 15:27:43.075646 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" event={"ID":"ae5eb9bd-1612-4698-850a-21e0b335a920","Type":"ContainerDied","Data":"1538b09e916e6ac3cadc0a8ca8995752a75e3591ed66c4de4a68b990d3882111"} Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.597661 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.634640 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-inventory\") pod \"ae5eb9bd-1612-4698-850a-21e0b335a920\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.634843 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqj5c\" (UniqueName: \"kubernetes.io/projected/ae5eb9bd-1612-4698-850a-21e0b335a920-kube-api-access-dqj5c\") pod \"ae5eb9bd-1612-4698-850a-21e0b335a920\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.634926 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-ssh-key\") pod \"ae5eb9bd-1612-4698-850a-21e0b335a920\" (UID: \"ae5eb9bd-1612-4698-850a-21e0b335a920\") " Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.643040 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae5eb9bd-1612-4698-850a-21e0b335a920-kube-api-access-dqj5c" (OuterVolumeSpecName: "kube-api-access-dqj5c") pod "ae5eb9bd-1612-4698-850a-21e0b335a920" (UID: "ae5eb9bd-1612-4698-850a-21e0b335a920"). InnerVolumeSpecName "kube-api-access-dqj5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.681598 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ae5eb9bd-1612-4698-850a-21e0b335a920" (UID: "ae5eb9bd-1612-4698-850a-21e0b335a920"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.681775 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-inventory" (OuterVolumeSpecName: "inventory") pod "ae5eb9bd-1612-4698-850a-21e0b335a920" (UID: "ae5eb9bd-1612-4698-850a-21e0b335a920"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.737525 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.737559 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqj5c\" (UniqueName: \"kubernetes.io/projected/ae5eb9bd-1612-4698-850a-21e0b335a920-kube-api-access-dqj5c\") on node \"crc\" DevicePath \"\"" Oct 01 15:27:44 crc kubenswrapper[4771]: I1001 15:27:44.737570 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ae5eb9bd-1612-4698-850a-21e0b335a920-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.099444 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" event={"ID":"ae5eb9bd-1612-4698-850a-21e0b335a920","Type":"ContainerDied","Data":"0dbd990d380a2e4cd67ba81593a41fd9a0e4b3314aa83053e142733e830e06c6"} Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.099845 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dbd990d380a2e4cd67ba81593a41fd9a0e4b3314aa83053e142733e830e06c6" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.099546 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-vv89q" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.203607 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cg72d"] Oct 01 15:27:45 crc kubenswrapper[4771]: E1001 15:27:45.204123 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae5eb9bd-1612-4698-850a-21e0b335a920" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.204149 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae5eb9bd-1612-4698-850a-21e0b335a920" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.204445 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae5eb9bd-1612-4698-850a-21e0b335a920" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.205295 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.208602 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.209041 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.209130 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.209889 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.224588 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cg72d"] Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.248921 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.249001 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcjhk\" (UniqueName: \"kubernetes.io/projected/3d9fa44b-220b-4f14-824c-1393dd61fc88-kube-api-access-tcjhk\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.249124 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.351215 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.351453 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.351510 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcjhk\" (UniqueName: \"kubernetes.io/projected/3d9fa44b-220b-4f14-824c-1393dd61fc88-kube-api-access-tcjhk\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.359018 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 
01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.364780 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.382121 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcjhk\" (UniqueName: \"kubernetes.io/projected/3d9fa44b-220b-4f14-824c-1393dd61fc88-kube-api-access-tcjhk\") pod \"ssh-known-hosts-edpm-deployment-cg72d\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:45 crc kubenswrapper[4771]: I1001 15:27:45.524869 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:46 crc kubenswrapper[4771]: I1001 15:27:46.085804 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cg72d"] Oct 01 15:27:46 crc kubenswrapper[4771]: I1001 15:27:46.109923 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" event={"ID":"3d9fa44b-220b-4f14-824c-1393dd61fc88","Type":"ContainerStarted","Data":"f80c7c8fd457bfd43da22b7ef4755427add6f133b354cc0187e3918c3316353e"} Oct 01 15:27:47 crc kubenswrapper[4771]: I1001 15:27:47.119759 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" event={"ID":"3d9fa44b-220b-4f14-824c-1393dd61fc88","Type":"ContainerStarted","Data":"577bbc61532dca25e3524589d4f26a765f86491dcae019aec4ed1b3ce8f6db22"} Oct 01 15:27:47 crc kubenswrapper[4771]: I1001 15:27:47.147830 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" podStartSLOduration=1.6064280850000001 podStartE2EDuration="2.147804663s" podCreationTimestamp="2025-10-01 15:27:45 +0000 UTC" firstStartedPulling="2025-10-01 15:27:46.096500545 +0000 UTC m=+1910.715675716" lastFinishedPulling="2025-10-01 15:27:46.637877083 +0000 UTC m=+1911.257052294" observedRunningTime="2025-10-01 15:27:47.136839184 +0000 UTC m=+1911.756014395" watchObservedRunningTime="2025-10-01 15:27:47.147804663 +0000 UTC m=+1911.766979864" Oct 01 15:27:55 crc kubenswrapper[4771]: I1001 15:27:55.206328 4771 generic.go:334] "Generic (PLEG): container finished" podID="3d9fa44b-220b-4f14-824c-1393dd61fc88" containerID="577bbc61532dca25e3524589d4f26a765f86491dcae019aec4ed1b3ce8f6db22" exitCode=0 Oct 01 15:27:55 crc kubenswrapper[4771]: I1001 15:27:55.206866 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" event={"ID":"3d9fa44b-220b-4f14-824c-1393dd61fc88","Type":"ContainerDied","Data":"577bbc61532dca25e3524589d4f26a765f86491dcae019aec4ed1b3ce8f6db22"} Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.732793 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.821756 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tcjhk\" (UniqueName: \"kubernetes.io/projected/3d9fa44b-220b-4f14-824c-1393dd61fc88-kube-api-access-tcjhk\") pod \"3d9fa44b-220b-4f14-824c-1393dd61fc88\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.821955 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-ssh-key-openstack-edpm-ipam\") pod \"3d9fa44b-220b-4f14-824c-1393dd61fc88\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.821999 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-inventory-0\") pod \"3d9fa44b-220b-4f14-824c-1393dd61fc88\" (UID: \"3d9fa44b-220b-4f14-824c-1393dd61fc88\") " Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.831675 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d9fa44b-220b-4f14-824c-1393dd61fc88-kube-api-access-tcjhk" (OuterVolumeSpecName: "kube-api-access-tcjhk") pod "3d9fa44b-220b-4f14-824c-1393dd61fc88" (UID: "3d9fa44b-220b-4f14-824c-1393dd61fc88"). InnerVolumeSpecName "kube-api-access-tcjhk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.861311 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3d9fa44b-220b-4f14-824c-1393dd61fc88" (UID: "3d9fa44b-220b-4f14-824c-1393dd61fc88"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.862726 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "3d9fa44b-220b-4f14-824c-1393dd61fc88" (UID: "3d9fa44b-220b-4f14-824c-1393dd61fc88"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.924469 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.924503 4771 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/3d9fa44b-220b-4f14-824c-1393dd61fc88-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:27:56 crc kubenswrapper[4771]: I1001 15:27:56.924515 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tcjhk\" (UniqueName: \"kubernetes.io/projected/3d9fa44b-220b-4f14-824c-1393dd61fc88-kube-api-access-tcjhk\") on node \"crc\" DevicePath \"\"" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.228950 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" 
event={"ID":"3d9fa44b-220b-4f14-824c-1393dd61fc88","Type":"ContainerDied","Data":"f80c7c8fd457bfd43da22b7ef4755427add6f133b354cc0187e3918c3316353e"} Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.228996 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f80c7c8fd457bfd43da22b7ef4755427add6f133b354cc0187e3918c3316353e" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.229075 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cg72d" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.301623 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9"] Oct 01 15:27:57 crc kubenswrapper[4771]: E1001 15:27:57.302140 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d9fa44b-220b-4f14-824c-1393dd61fc88" containerName="ssh-known-hosts-edpm-deployment" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.302166 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d9fa44b-220b-4f14-824c-1393dd61fc88" containerName="ssh-known-hosts-edpm-deployment" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.302423 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d9fa44b-220b-4f14-824c-1393dd61fc88" containerName="ssh-known-hosts-edpm-deployment" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.303209 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.307601 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.307605 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.307662 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.307881 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.316862 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9"] Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.435124 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swc8r\" (UniqueName: \"kubernetes.io/projected/8a04e35a-e2e5-412d-ab61-896f5271ac14-kube-api-access-swc8r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.435569 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.435611 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.538220 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swc8r\" (UniqueName: \"kubernetes.io/projected/8a04e35a-e2e5-412d-ab61-896f5271ac14-kube-api-access-swc8r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.538588 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.538675 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.544643 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.550446 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.564158 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swc8r\" (UniqueName: \"kubernetes.io/projected/8a04e35a-e2e5-412d-ab61-896f5271ac14-kube-api-access-swc8r\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-d6dm9\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:57 crc kubenswrapper[4771]: I1001 15:27:57.630239 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:27:58 crc kubenswrapper[4771]: I1001 15:27:58.157906 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9"] Oct 01 15:27:58 crc kubenswrapper[4771]: I1001 15:27:58.241169 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" event={"ID":"8a04e35a-e2e5-412d-ab61-896f5271ac14","Type":"ContainerStarted","Data":"5d97d7148eb98706063a281105bc898ff4a9510b5c4731cbe116be9fcb53d476"} Oct 01 15:27:59 crc kubenswrapper[4771]: I1001 15:27:59.259718 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" event={"ID":"8a04e35a-e2e5-412d-ab61-896f5271ac14","Type":"ContainerStarted","Data":"bf44204d582182eed094ca224bff7f4b4d815090122cf1128fea33c44c810693"} Oct 01 15:27:59 crc kubenswrapper[4771]: I1001 15:27:59.282997 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" podStartSLOduration=1.72826025 podStartE2EDuration="2.282977373s" podCreationTimestamp="2025-10-01 15:27:57 +0000 UTC" firstStartedPulling="2025-10-01 15:27:58.169503247 +0000 UTC m=+1922.788678428" lastFinishedPulling="2025-10-01 15:27:58.72422037 +0000 UTC m=+1923.343395551" observedRunningTime="2025-10-01 15:27:59.279590691 +0000 UTC m=+1923.898765892" watchObservedRunningTime="2025-10-01 15:27:59.282977373 +0000 UTC m=+1923.902152554" Oct 01 15:28:08 crc kubenswrapper[4771]: I1001 15:28:08.352326 4771 generic.go:334] "Generic (PLEG): container finished" podID="8a04e35a-e2e5-412d-ab61-896f5271ac14" containerID="bf44204d582182eed094ca224bff7f4b4d815090122cf1128fea33c44c810693" exitCode=0 Oct 01 15:28:08 crc kubenswrapper[4771]: I1001 15:28:08.352395 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" event={"ID":"8a04e35a-e2e5-412d-ab61-896f5271ac14","Type":"ContainerDied","Data":"bf44204d582182eed094ca224bff7f4b4d815090122cf1128fea33c44c810693"} Oct 01 15:28:09 crc kubenswrapper[4771]: I1001 15:28:09.884842 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:28:09 crc kubenswrapper[4771]: I1001 15:28:09.908769 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swc8r\" (UniqueName: \"kubernetes.io/projected/8a04e35a-e2e5-412d-ab61-896f5271ac14-kube-api-access-swc8r\") pod \"8a04e35a-e2e5-412d-ab61-896f5271ac14\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " Oct 01 15:28:09 crc kubenswrapper[4771]: I1001 15:28:09.908854 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-inventory\") pod \"8a04e35a-e2e5-412d-ab61-896f5271ac14\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " Oct 01 15:28:09 crc kubenswrapper[4771]: I1001 15:28:09.909043 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-ssh-key\") pod \"8a04e35a-e2e5-412d-ab61-896f5271ac14\" (UID: \"8a04e35a-e2e5-412d-ab61-896f5271ac14\") " Oct 01 15:28:09 crc kubenswrapper[4771]: I1001 15:28:09.916300 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a04e35a-e2e5-412d-ab61-896f5271ac14-kube-api-access-swc8r" (OuterVolumeSpecName: "kube-api-access-swc8r") pod "8a04e35a-e2e5-412d-ab61-896f5271ac14" (UID: "8a04e35a-e2e5-412d-ab61-896f5271ac14"). InnerVolumeSpecName "kube-api-access-swc8r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:28:09 crc kubenswrapper[4771]: I1001 15:28:09.940849 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8a04e35a-e2e5-412d-ab61-896f5271ac14" (UID: "8a04e35a-e2e5-412d-ab61-896f5271ac14"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:28:09 crc kubenswrapper[4771]: I1001 15:28:09.964286 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-inventory" (OuterVolumeSpecName: "inventory") pod "8a04e35a-e2e5-412d-ab61-896f5271ac14" (UID: "8a04e35a-e2e5-412d-ab61-896f5271ac14"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.011079 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.011111 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swc8r\" (UniqueName: \"kubernetes.io/projected/8a04e35a-e2e5-412d-ab61-896f5271ac14-kube-api-access-swc8r\") on node \"crc\" DevicePath \"\"" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.011124 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a04e35a-e2e5-412d-ab61-896f5271ac14-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.388338 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" 
event={"ID":"8a04e35a-e2e5-412d-ab61-896f5271ac14","Type":"ContainerDied","Data":"5d97d7148eb98706063a281105bc898ff4a9510b5c4731cbe116be9fcb53d476"} Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.388392 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d97d7148eb98706063a281105bc898ff4a9510b5c4731cbe116be9fcb53d476" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.388425 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-d6dm9" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.476791 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz"] Oct 01 15:28:10 crc kubenswrapper[4771]: E1001 15:28:10.483141 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a04e35a-e2e5-412d-ab61-896f5271ac14" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.483183 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a04e35a-e2e5-412d-ab61-896f5271ac14" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.483656 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a04e35a-e2e5-412d-ab61-896f5271ac14" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.484639 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.487178 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.487846 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.487894 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.491792 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.501142 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz"] Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.521909 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krpcq\" (UniqueName: \"kubernetes.io/projected/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-kube-api-access-krpcq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.522077 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.522119 4771 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.623839 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.623892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.623965 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krpcq\" (UniqueName: \"kubernetes.io/projected/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-kube-api-access-krpcq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.628864 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.640961 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.646946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krpcq\" (UniqueName: \"kubernetes.io/projected/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-kube-api-access-krpcq\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:10 crc kubenswrapper[4771]: I1001 15:28:10.850932 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:11 crc kubenswrapper[4771]: I1001 15:28:11.430407 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz"] Oct 01 15:28:11 crc kubenswrapper[4771]: W1001 15:28:11.431105 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod560e443b_7ae0_4b0c_912d_6f7895b3a8dd.slice/crio-8982b111f468445db04832c1b02381d95e8d3e853b3db2e77047fedf8a78fb7e WatchSource:0}: Error finding container 8982b111f468445db04832c1b02381d95e8d3e853b3db2e77047fedf8a78fb7e: Status 404 returned error can't find the container with id 8982b111f468445db04832c1b02381d95e8d3e853b3db2e77047fedf8a78fb7e Oct 01 15:28:12 crc kubenswrapper[4771]: I1001 15:28:12.413976 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" event={"ID":"560e443b-7ae0-4b0c-912d-6f7895b3a8dd","Type":"ContainerStarted","Data":"e4eeb59dd0e4cb5358291606365e8f52595d35f42f63ce6a90f389ee17738c5c"} Oct 01 15:28:12 crc kubenswrapper[4771]: I1001 15:28:12.414521 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" event={"ID":"560e443b-7ae0-4b0c-912d-6f7895b3a8dd","Type":"ContainerStarted","Data":"8982b111f468445db04832c1b02381d95e8d3e853b3db2e77047fedf8a78fb7e"} Oct 01 15:28:22 crc kubenswrapper[4771]: I1001 15:28:22.537310 4771 generic.go:334] "Generic (PLEG): container finished" podID="560e443b-7ae0-4b0c-912d-6f7895b3a8dd" containerID="e4eeb59dd0e4cb5358291606365e8f52595d35f42f63ce6a90f389ee17738c5c" exitCode=0 Oct 01 15:28:22 crc kubenswrapper[4771]: I1001 15:28:22.537446 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" 
event={"ID":"560e443b-7ae0-4b0c-912d-6f7895b3a8dd","Type":"ContainerDied","Data":"e4eeb59dd0e4cb5358291606365e8f52595d35f42f63ce6a90f389ee17738c5c"} Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.020755 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.120198 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-ssh-key\") pod \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.120349 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-krpcq\" (UniqueName: \"kubernetes.io/projected/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-kube-api-access-krpcq\") pod \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.120426 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-inventory\") pod \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\" (UID: \"560e443b-7ae0-4b0c-912d-6f7895b3a8dd\") " Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.125723 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-kube-api-access-krpcq" (OuterVolumeSpecName: "kube-api-access-krpcq") pod "560e443b-7ae0-4b0c-912d-6f7895b3a8dd" (UID: "560e443b-7ae0-4b0c-912d-6f7895b3a8dd"). InnerVolumeSpecName "kube-api-access-krpcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.146213 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-inventory" (OuterVolumeSpecName: "inventory") pod "560e443b-7ae0-4b0c-912d-6f7895b3a8dd" (UID: "560e443b-7ae0-4b0c-912d-6f7895b3a8dd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.156640 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "560e443b-7ae0-4b0c-912d-6f7895b3a8dd" (UID: "560e443b-7ae0-4b0c-912d-6f7895b3a8dd"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.224109 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.224144 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-krpcq\" (UniqueName: \"kubernetes.io/projected/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-kube-api-access-krpcq\") on node \"crc\" DevicePath \"\"" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.224160 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/560e443b-7ae0-4b0c-912d-6f7895b3a8dd-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.560825 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" 
event={"ID":"560e443b-7ae0-4b0c-912d-6f7895b3a8dd","Type":"ContainerDied","Data":"8982b111f468445db04832c1b02381d95e8d3e853b3db2e77047fedf8a78fb7e"} Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.560864 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8982b111f468445db04832c1b02381d95e8d3e853b3db2e77047fedf8a78fb7e" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.560907 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.670951 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797"] Oct 01 15:28:24 crc kubenswrapper[4771]: E1001 15:28:24.671500 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="560e443b-7ae0-4b0c-912d-6f7895b3a8dd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.671528 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="560e443b-7ae0-4b0c-912d-6f7895b3a8dd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.671924 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="560e443b-7ae0-4b0c-912d-6f7895b3a8dd" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.672960 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.676290 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.676533 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.676786 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.676920 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.677043 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.677158 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.677266 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.680258 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.685695 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797"] Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.835663 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836140 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836186 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836245 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836333 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836389 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836425 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836471 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836530 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836621 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836676 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836727 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836809 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzgks\" (UniqueName: 
\"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-kube-api-access-hzgks\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.836863 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938267 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938341 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938372 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938411 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938479 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938512 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938542 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" 
(UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938569 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938611 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938657 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938687 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 
15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938721 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938811 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzgks\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-kube-api-access-hzgks\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.938861 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.942969 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.943519 4771 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.944918 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.946364 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.947206 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.947332 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.947340 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.949453 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.949567 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.950320 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ssh-key\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.950411 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.950429 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.954205 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.963695 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzgks\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-kube-api-access-hzgks\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-qk797\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:24 crc kubenswrapper[4771]: I1001 15:28:24.998534 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:28:25 crc kubenswrapper[4771]: I1001 15:28:25.379885 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797"] Oct 01 15:28:25 crc kubenswrapper[4771]: I1001 15:28:25.573404 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" event={"ID":"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a","Type":"ContainerStarted","Data":"9cb86d9cb6bd654a8c4a1ec4bd04597444b83ca90d7f4711af313f8b8ebee646"} Oct 01 15:28:26 crc kubenswrapper[4771]: I1001 15:28:26.583373 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" event={"ID":"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a","Type":"ContainerStarted","Data":"53059152393e2533267fa4d288ff11093a1e34d16b1a2ea0e587dae9eb31f8c5"} Oct 01 15:28:26 crc kubenswrapper[4771]: I1001 15:28:26.603692 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" podStartSLOduration=2.141871783 podStartE2EDuration="2.603674667s" podCreationTimestamp="2025-10-01 15:28:24 +0000 UTC" firstStartedPulling="2025-10-01 15:28:25.382264463 +0000 UTC m=+1950.001439644" lastFinishedPulling="2025-10-01 15:28:25.844067347 +0000 UTC m=+1950.463242528" observedRunningTime="2025-10-01 15:28:26.599559396 +0000 UTC m=+1951.218734577" watchObservedRunningTime="2025-10-01 15:28:26.603674667 +0000 UTC m=+1951.222849838" Oct 01 15:28:42 crc kubenswrapper[4771]: I1001 15:28:42.177672 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:28:42 crc kubenswrapper[4771]: I1001 15:28:42.178449 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:29:10 crc kubenswrapper[4771]: E1001 15:29:10.404244 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode6e7232e_0b6f_433f_a1e5_f99aab22ed8a.slice/crio-conmon-53059152393e2533267fa4d288ff11093a1e34d16b1a2ea0e587dae9eb31f8c5.scope\": RecentStats: unable to find data in memory cache]" Oct 01 15:29:11 crc kubenswrapper[4771]: I1001 15:29:11.087040 4771 generic.go:334] "Generic (PLEG): container finished" podID="e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" containerID="53059152393e2533267fa4d288ff11093a1e34d16b1a2ea0e587dae9eb31f8c5" exitCode=0 Oct 01 15:29:11 crc kubenswrapper[4771]: I1001 15:29:11.087186 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" event={"ID":"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a","Type":"ContainerDied","Data":"53059152393e2533267fa4d288ff11093a1e34d16b1a2ea0e587dae9eb31f8c5"} Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.177571 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:29:12 crc kubenswrapper[4771]: 
I1001 15:29:12.177622 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.503126 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.682444 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-neutron-metadata-combined-ca-bundle\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.682561 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-bootstrap-combined-ca-bundle\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.682681 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-telemetry-combined-ca-bundle\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.682724 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-nova-combined-ca-bundle\") pod 
\"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.682815 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-libvirt-combined-ca-bundle\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.682852 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ssh-key\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.683524 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ovn-combined-ca-bundle\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.683553 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-repo-setup-combined-ca-bundle\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.683628 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 
crc kubenswrapper[4771]: I1001 15:29:12.683668 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.683687 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-ovn-default-certs-0\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.683769 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzgks\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-kube-api-access-hzgks\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.683805 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-inventory\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.683822 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\" (UID: \"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a\") " Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.691245 4771 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.691529 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.691540 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.691648 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-kube-api-access-hzgks" (OuterVolumeSpecName: "kube-api-access-hzgks") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "kube-api-access-hzgks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.692085 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.692287 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.692321 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.692777 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.693020 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.693075 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.694235 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.700123 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.724350 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-inventory" (OuterVolumeSpecName: "inventory") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.728106 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" (UID: "e6e7232e-0b6f-433f-a1e5-f99aab22ed8a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.786689 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.786777 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.786803 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.786968 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzgks\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-kube-api-access-hzgks\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787038 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787060 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787079 4771 
reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787099 4771 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787117 4771 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787133 4771 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787150 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787167 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787186 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:12 crc kubenswrapper[4771]: I1001 15:29:12.787203 4771 
reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e7232e-0b6f-433f-a1e5-f99aab22ed8a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.106115 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" event={"ID":"e6e7232e-0b6f-433f-a1e5-f99aab22ed8a","Type":"ContainerDied","Data":"9cb86d9cb6bd654a8c4a1ec4bd04597444b83ca90d7f4711af313f8b8ebee646"} Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.106184 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cb86d9cb6bd654a8c4a1ec4bd04597444b83ca90d7f4711af313f8b8ebee646" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.106314 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-qk797" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.270124 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv"] Oct 01 15:29:13 crc kubenswrapper[4771]: E1001 15:29:13.270565 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.270579 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.270769 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e7232e-0b6f-433f-a1e5-f99aab22ed8a" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.271381 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.274090 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.274867 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.275118 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.275291 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.277593 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.278467 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv"] Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.401109 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.401396 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.401474 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxhv6\" (UniqueName: \"kubernetes.io/projected/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-kube-api-access-gxhv6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.401846 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.401950 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.503399 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.503457 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.503487 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.503584 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxhv6\" (UniqueName: \"kubernetes.io/projected/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-kube-api-access-gxhv6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.503722 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.504672 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc 
kubenswrapper[4771]: I1001 15:29:13.508891 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.509182 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.509377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.521981 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxhv6\" (UniqueName: \"kubernetes.io/projected/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-kube-api-access-gxhv6\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-h96xv\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:13 crc kubenswrapper[4771]: I1001 15:29:13.592518 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:29:14 crc kubenswrapper[4771]: I1001 15:29:14.168947 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:29:14 crc kubenswrapper[4771]: I1001 15:29:14.172354 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv"] Oct 01 15:29:15 crc kubenswrapper[4771]: I1001 15:29:15.130353 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" event={"ID":"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295","Type":"ContainerStarted","Data":"8f835321e32059624ae0c809e7efc5bb68b025cc2d090ce1a65c172747bee975"} Oct 01 15:29:15 crc kubenswrapper[4771]: I1001 15:29:15.132070 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" event={"ID":"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295","Type":"ContainerStarted","Data":"7bc96652360139afc291a51d561ef581caeb897b5950c6ea3833f014a2034ba8"} Oct 01 15:29:15 crc kubenswrapper[4771]: I1001 15:29:15.155166 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" podStartSLOduration=1.642076348 podStartE2EDuration="2.155144914s" podCreationTimestamp="2025-10-01 15:29:13 +0000 UTC" firstStartedPulling="2025-10-01 15:29:14.168600599 +0000 UTC m=+1998.787775780" lastFinishedPulling="2025-10-01 15:29:14.681669135 +0000 UTC m=+1999.300844346" observedRunningTime="2025-10-01 15:29:15.153682048 +0000 UTC m=+1999.772857229" watchObservedRunningTime="2025-10-01 15:29:15.155144914 +0000 UTC m=+1999.774320085" Oct 01 15:29:42 crc kubenswrapper[4771]: I1001 15:29:42.177197 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:29:42 crc kubenswrapper[4771]: I1001 15:29:42.178008 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:29:42 crc kubenswrapper[4771]: I1001 15:29:42.178077 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:29:42 crc kubenswrapper[4771]: I1001 15:29:42.178908 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9a6a9c0077062d17e78226f49233ded0b62ef8f37ee11aa7036368c11d9a3cf2"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:29:42 crc kubenswrapper[4771]: I1001 15:29:42.179010 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://9a6a9c0077062d17e78226f49233ded0b62ef8f37ee11aa7036368c11d9a3cf2" gracePeriod=600 Oct 01 15:29:42 crc kubenswrapper[4771]: I1001 15:29:42.408890 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="9a6a9c0077062d17e78226f49233ded0b62ef8f37ee11aa7036368c11d9a3cf2" exitCode=0 Oct 01 15:29:42 crc kubenswrapper[4771]: I1001 15:29:42.408942 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" 
event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"9a6a9c0077062d17e78226f49233ded0b62ef8f37ee11aa7036368c11d9a3cf2"} Oct 01 15:29:42 crc kubenswrapper[4771]: I1001 15:29:42.408986 4771 scope.go:117] "RemoveContainer" containerID="14f82a58b71f640691d4b9ebb4629f11abf0ca28aa3c0c30ba09d2fe31d6a0a2" Oct 01 15:29:43 crc kubenswrapper[4771]: I1001 15:29:43.426206 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8"} Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.154889 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz"] Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.156587 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.160029 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.163565 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.186534 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz"] Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.279036 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-secret-volume\") pod \"collect-profiles-29322210-gbdnz\" (UID: 
\"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.279169 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7q79\" (UniqueName: \"kubernetes.io/projected/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-kube-api-access-g7q79\") pod \"collect-profiles-29322210-gbdnz\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.279398 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-config-volume\") pod \"collect-profiles-29322210-gbdnz\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.381429 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7q79\" (UniqueName: \"kubernetes.io/projected/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-kube-api-access-g7q79\") pod \"collect-profiles-29322210-gbdnz\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.381537 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-config-volume\") pod \"collect-profiles-29322210-gbdnz\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.381763 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-secret-volume\") pod \"collect-profiles-29322210-gbdnz\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.383143 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-config-volume\") pod \"collect-profiles-29322210-gbdnz\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.389155 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-secret-volume\") pod \"collect-profiles-29322210-gbdnz\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.405399 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7q79\" (UniqueName: \"kubernetes.io/projected/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-kube-api-access-g7q79\") pod \"collect-profiles-29322210-gbdnz\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.500884 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:00 crc kubenswrapper[4771]: I1001 15:30:00.968899 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz"] Oct 01 15:30:00 crc kubenswrapper[4771]: W1001 15:30:00.983295 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e1a608_b8a8_4ecc_a7eb_6e2434996e51.slice/crio-eae8ddcf773f4020adfb7e1401d3338f65f5be7f3a2cc197c469bcc000586c0e WatchSource:0}: Error finding container eae8ddcf773f4020adfb7e1401d3338f65f5be7f3a2cc197c469bcc000586c0e: Status 404 returned error can't find the container with id eae8ddcf773f4020adfb7e1401d3338f65f5be7f3a2cc197c469bcc000586c0e Oct 01 15:30:01 crc kubenswrapper[4771]: I1001 15:30:01.641043 4771 generic.go:334] "Generic (PLEG): container finished" podID="36e1a608-b8a8-4ecc-a7eb-6e2434996e51" containerID="f3548f49d62b30c8499b858d010bfdbf7f41543758a76537a43ae66fac195064" exitCode=0 Oct 01 15:30:01 crc kubenswrapper[4771]: I1001 15:30:01.641109 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" event={"ID":"36e1a608-b8a8-4ecc-a7eb-6e2434996e51","Type":"ContainerDied","Data":"f3548f49d62b30c8499b858d010bfdbf7f41543758a76537a43ae66fac195064"} Oct 01 15:30:01 crc kubenswrapper[4771]: I1001 15:30:01.643282 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" event={"ID":"36e1a608-b8a8-4ecc-a7eb-6e2434996e51","Type":"ContainerStarted","Data":"eae8ddcf773f4020adfb7e1401d3338f65f5be7f3a2cc197c469bcc000586c0e"} Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.034510 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.130061 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-secret-volume\") pod \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.130156 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7q79\" (UniqueName: \"kubernetes.io/projected/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-kube-api-access-g7q79\") pod \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.130257 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-config-volume\") pod \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\" (UID: \"36e1a608-b8a8-4ecc-a7eb-6e2434996e51\") " Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.132248 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-config-volume" (OuterVolumeSpecName: "config-volume") pod "36e1a608-b8a8-4ecc-a7eb-6e2434996e51" (UID: "36e1a608-b8a8-4ecc-a7eb-6e2434996e51"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.135394 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.140183 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36e1a608-b8a8-4ecc-a7eb-6e2434996e51" (UID: "36e1a608-b8a8-4ecc-a7eb-6e2434996e51"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.140624 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-kube-api-access-g7q79" (OuterVolumeSpecName: "kube-api-access-g7q79") pod "36e1a608-b8a8-4ecc-a7eb-6e2434996e51" (UID: "36e1a608-b8a8-4ecc-a7eb-6e2434996e51"). InnerVolumeSpecName "kube-api-access-g7q79". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.238384 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.238445 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7q79\" (UniqueName: \"kubernetes.io/projected/36e1a608-b8a8-4ecc-a7eb-6e2434996e51-kube-api-access-g7q79\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.668322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" event={"ID":"36e1a608-b8a8-4ecc-a7eb-6e2434996e51","Type":"ContainerDied","Data":"eae8ddcf773f4020adfb7e1401d3338f65f5be7f3a2cc197c469bcc000586c0e"} Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.668381 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eae8ddcf773f4020adfb7e1401d3338f65f5be7f3a2cc197c469bcc000586c0e" Oct 01 15:30:03 crc kubenswrapper[4771]: I1001 15:30:03.668401 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322210-gbdnz" Oct 01 15:30:04 crc kubenswrapper[4771]: I1001 15:30:04.137064 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz"] Oct 01 15:30:04 crc kubenswrapper[4771]: I1001 15:30:04.147996 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322165-xdwrz"] Oct 01 15:30:05 crc kubenswrapper[4771]: I1001 15:30:05.999252 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40496e6d-3f79-4478-804b-dc9904473801" path="/var/lib/kubelet/pods/40496e6d-3f79-4478-804b-dc9904473801/volumes" Oct 01 15:30:24 crc kubenswrapper[4771]: I1001 15:30:24.897776 4771 generic.go:334] "Generic (PLEG): container finished" podID="0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" containerID="8f835321e32059624ae0c809e7efc5bb68b025cc2d090ce1a65c172747bee975" exitCode=0 Oct 01 15:30:24 crc kubenswrapper[4771]: I1001 15:30:24.897838 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" event={"ID":"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295","Type":"ContainerDied","Data":"8f835321e32059624ae0c809e7efc5bb68b025cc2d090ce1a65c172747bee975"} Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.307083 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.422295 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxhv6\" (UniqueName: \"kubernetes.io/projected/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-kube-api-access-gxhv6\") pod \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.422385 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovncontroller-config-0\") pod \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.422421 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovn-combined-ca-bundle\") pod \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.422467 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-inventory\") pod \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.422546 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ssh-key\") pod \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\" (UID: \"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295\") " Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.428437 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-kube-api-access-gxhv6" (OuterVolumeSpecName: "kube-api-access-gxhv6") pod "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" (UID: "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295"). InnerVolumeSpecName "kube-api-access-gxhv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.429309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" (UID: "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.454534 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" (UID: "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.458307 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" (UID: "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.465544 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-inventory" (OuterVolumeSpecName: "inventory") pod "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" (UID: "0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.524537 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.524567 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxhv6\" (UniqueName: \"kubernetes.io/projected/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-kube-api-access-gxhv6\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.524580 4771 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.524589 4771 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.524597 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.923681 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" event={"ID":"0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295","Type":"ContainerDied","Data":"7bc96652360139afc291a51d561ef581caeb897b5950c6ea3833f014a2034ba8"} Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.923779 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7bc96652360139afc291a51d561ef581caeb897b5950c6ea3833f014a2034ba8" Oct 01 15:30:26 crc kubenswrapper[4771]: I1001 15:30:26.923789 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-h96xv" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.030776 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s"] Oct 01 15:30:27 crc kubenswrapper[4771]: E1001 15:30:27.031265 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36e1a608-b8a8-4ecc-a7eb-6e2434996e51" containerName="collect-profiles" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.031289 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="36e1a608-b8a8-4ecc-a7eb-6e2434996e51" containerName="collect-profiles" Oct 01 15:30:27 crc kubenswrapper[4771]: E1001 15:30:27.031310 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.031320 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.031602 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.031657 4771 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="36e1a608-b8a8-4ecc-a7eb-6e2434996e51" containerName="collect-profiles" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.032520 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.038424 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.038683 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.038847 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.039030 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.039221 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.042902 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s"] Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.046092 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.138294 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" 
(UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.138335 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.138395 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.138442 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.138482 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6t96\" (UniqueName: \"kubernetes.io/projected/9c50c132-15f0-45c7-a895-46fe2be6003e-kube-api-access-g6t96\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.138508 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.240713 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.241113 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.241348 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 
01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.241602 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.241846 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6t96\" (UniqueName: \"kubernetes.io/projected/9c50c132-15f0-45c7-a895-46fe2be6003e-kube-api-access-g6t96\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.242029 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.246423 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.246794 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.249193 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.251494 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.255422 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.265138 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6t96\" (UniqueName: \"kubernetes.io/projected/9c50c132-15f0-45c7-a895-46fe2be6003e-kube-api-access-g6t96\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s\" (UID: 
\"9c50c132-15f0-45c7-a895-46fe2be6003e\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.384872 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:30:27 crc kubenswrapper[4771]: I1001 15:30:27.933627 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s"] Oct 01 15:30:28 crc kubenswrapper[4771]: I1001 15:30:28.942593 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" event={"ID":"9c50c132-15f0-45c7-a895-46fe2be6003e","Type":"ContainerStarted","Data":"2c8c1d9d12667d0060d98ef7b6b78ba3281eb54bb266356987e97449cf377cce"} Oct 01 15:30:28 crc kubenswrapper[4771]: I1001 15:30:28.944030 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" event={"ID":"9c50c132-15f0-45c7-a895-46fe2be6003e","Type":"ContainerStarted","Data":"231e9cff195b9196e80d984b597453fed022db484892b9af5b9f5582a50d42c1"} Oct 01 15:30:28 crc kubenswrapper[4771]: I1001 15:30:28.994771 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" podStartSLOduration=2.549969393 podStartE2EDuration="2.994746236s" podCreationTimestamp="2025-10-01 15:30:26 +0000 UTC" firstStartedPulling="2025-10-01 15:30:27.935175862 +0000 UTC m=+2072.554351043" lastFinishedPulling="2025-10-01 15:30:28.379952705 +0000 UTC m=+2072.999127886" observedRunningTime="2025-10-01 15:30:28.963166276 +0000 UTC m=+2073.582341457" watchObservedRunningTime="2025-10-01 15:30:28.994746236 +0000 UTC m=+2073.613921417" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.303754 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-4rct2"] Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.306566 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.323210 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rct2"] Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.400637 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-utilities\") pod \"redhat-marketplace-4rct2\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.400792 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsg9k\" (UniqueName: \"kubernetes.io/projected/b6d40cbc-35fb-4d0d-a948-acd279363bd0-kube-api-access-rsg9k\") pod \"redhat-marketplace-4rct2\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.400868 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-catalog-content\") pod \"redhat-marketplace-4rct2\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.503004 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsg9k\" (UniqueName: \"kubernetes.io/projected/b6d40cbc-35fb-4d0d-a948-acd279363bd0-kube-api-access-rsg9k\") pod \"redhat-marketplace-4rct2\" (UID: 
\"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.503100 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-catalog-content\") pod \"redhat-marketplace-4rct2\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.503137 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-utilities\") pod \"redhat-marketplace-4rct2\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.503716 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-utilities\") pod \"redhat-marketplace-4rct2\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.503717 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-catalog-content\") pod \"redhat-marketplace-4rct2\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.528656 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsg9k\" (UniqueName: \"kubernetes.io/projected/b6d40cbc-35fb-4d0d-a948-acd279363bd0-kube-api-access-rsg9k\") pod \"redhat-marketplace-4rct2\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " 
pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:39 crc kubenswrapper[4771]: I1001 15:30:39.634881 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:40 crc kubenswrapper[4771]: I1001 15:30:40.154539 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rct2"] Oct 01 15:30:41 crc kubenswrapper[4771]: I1001 15:30:41.072595 4771 generic.go:334] "Generic (PLEG): container finished" podID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerID="a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412" exitCode=0 Oct 01 15:30:41 crc kubenswrapper[4771]: I1001 15:30:41.072683 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rct2" event={"ID":"b6d40cbc-35fb-4d0d-a948-acd279363bd0","Type":"ContainerDied","Data":"a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412"} Oct 01 15:30:41 crc kubenswrapper[4771]: I1001 15:30:41.072979 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rct2" event={"ID":"b6d40cbc-35fb-4d0d-a948-acd279363bd0","Type":"ContainerStarted","Data":"a4f69d7f5260cddfef7339702045d99603a4e689898fb5dbf68f7a0a0804c8f0"} Oct 01 15:30:43 crc kubenswrapper[4771]: I1001 15:30:43.099091 4771 generic.go:334] "Generic (PLEG): container finished" podID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerID="5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc" exitCode=0 Oct 01 15:30:43 crc kubenswrapper[4771]: I1001 15:30:43.099209 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rct2" event={"ID":"b6d40cbc-35fb-4d0d-a948-acd279363bd0","Type":"ContainerDied","Data":"5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc"} Oct 01 15:30:44 crc kubenswrapper[4771]: I1001 15:30:44.111871 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-4rct2" event={"ID":"b6d40cbc-35fb-4d0d-a948-acd279363bd0","Type":"ContainerStarted","Data":"6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b"} Oct 01 15:30:44 crc kubenswrapper[4771]: I1001 15:30:44.132294 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4rct2" podStartSLOduration=2.406167391 podStartE2EDuration="5.132275652s" podCreationTimestamp="2025-10-01 15:30:39 +0000 UTC" firstStartedPulling="2025-10-01 15:30:41.075497972 +0000 UTC m=+2085.694673193" lastFinishedPulling="2025-10-01 15:30:43.801606273 +0000 UTC m=+2088.420781454" observedRunningTime="2025-10-01 15:30:44.128618972 +0000 UTC m=+2088.747794143" watchObservedRunningTime="2025-10-01 15:30:44.132275652 +0000 UTC m=+2088.751450823" Oct 01 15:30:49 crc kubenswrapper[4771]: I1001 15:30:49.635939 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:49 crc kubenswrapper[4771]: I1001 15:30:49.636530 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:49 crc kubenswrapper[4771]: I1001 15:30:49.684537 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:50 crc kubenswrapper[4771]: I1001 15:30:50.217012 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:50 crc kubenswrapper[4771]: I1001 15:30:50.262439 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rct2"] Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.190019 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4rct2" 
podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerName="registry-server" containerID="cri-o://6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b" gracePeriod=2 Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.677593 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.771180 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-utilities\") pod \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.771786 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-catalog-content\") pod \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.772266 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsg9k\" (UniqueName: \"kubernetes.io/projected/b6d40cbc-35fb-4d0d-a948-acd279363bd0-kube-api-access-rsg9k\") pod \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\" (UID: \"b6d40cbc-35fb-4d0d-a948-acd279363bd0\") " Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.773321 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-utilities" (OuterVolumeSpecName: "utilities") pod "b6d40cbc-35fb-4d0d-a948-acd279363bd0" (UID: "b6d40cbc-35fb-4d0d-a948-acd279363bd0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.773871 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.778053 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6d40cbc-35fb-4d0d-a948-acd279363bd0-kube-api-access-rsg9k" (OuterVolumeSpecName: "kube-api-access-rsg9k") pod "b6d40cbc-35fb-4d0d-a948-acd279363bd0" (UID: "b6d40cbc-35fb-4d0d-a948-acd279363bd0"). InnerVolumeSpecName "kube-api-access-rsg9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.786883 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6d40cbc-35fb-4d0d-a948-acd279363bd0" (UID: "b6d40cbc-35fb-4d0d-a948-acd279363bd0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.874814 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsg9k\" (UniqueName: \"kubernetes.io/projected/b6d40cbc-35fb-4d0d-a948-acd279363bd0-kube-api-access-rsg9k\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:52 crc kubenswrapper[4771]: I1001 15:30:52.874837 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6d40cbc-35fb-4d0d-a948-acd279363bd0-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.202186 4771 generic.go:334] "Generic (PLEG): container finished" podID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerID="6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b" exitCode=0 Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.202224 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rct2" event={"ID":"b6d40cbc-35fb-4d0d-a948-acd279363bd0","Type":"ContainerDied","Data":"6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b"} Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.202251 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rct2" event={"ID":"b6d40cbc-35fb-4d0d-a948-acd279363bd0","Type":"ContainerDied","Data":"a4f69d7f5260cddfef7339702045d99603a4e689898fb5dbf68f7a0a0804c8f0"} Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.202268 4771 scope.go:117] "RemoveContainer" containerID="6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.202262 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rct2" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.231067 4771 scope.go:117] "RemoveContainer" containerID="5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.235671 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rct2"] Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.246700 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rct2"] Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.258214 4771 scope.go:117] "RemoveContainer" containerID="a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.289945 4771 scope.go:117] "RemoveContainer" containerID="6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b" Oct 01 15:30:53 crc kubenswrapper[4771]: E1001 15:30:53.290495 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b\": container with ID starting with 6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b not found: ID does not exist" containerID="6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.290530 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b"} err="failed to get container status \"6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b\": rpc error: code = NotFound desc = could not find container \"6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b\": container with ID starting with 6910309bed64c85954a3717292a842436a0a17343f52a832b574a2a531e80e6b not found: 
ID does not exist" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.290552 4771 scope.go:117] "RemoveContainer" containerID="5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc" Oct 01 15:30:53 crc kubenswrapper[4771]: E1001 15:30:53.291010 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc\": container with ID starting with 5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc not found: ID does not exist" containerID="5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.291038 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc"} err="failed to get container status \"5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc\": rpc error: code = NotFound desc = could not find container \"5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc\": container with ID starting with 5217abc9ad12926578ca91967ca8e0f8dcc0f0189257c6bcec799c2c44aa52fc not found: ID does not exist" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.291057 4771 scope.go:117] "RemoveContainer" containerID="a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412" Oct 01 15:30:53 crc kubenswrapper[4771]: E1001 15:30:53.291325 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412\": container with ID starting with a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412 not found: ID does not exist" containerID="a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.291393 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412"} err="failed to get container status \"a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412\": rpc error: code = NotFound desc = could not find container \"a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412\": container with ID starting with a55c4afb4ea329fe2b56a6d7583a49e08d7006da726037362142e3ef7ba65412 not found: ID does not exist" Oct 01 15:30:53 crc kubenswrapper[4771]: I1001 15:30:53.997949 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" path="/var/lib/kubelet/pods/b6d40cbc-35fb-4d0d-a948-acd279363bd0/volumes" Oct 01 15:31:03 crc kubenswrapper[4771]: I1001 15:31:03.379736 4771 scope.go:117] "RemoveContainer" containerID="1262a15d0ce773b7696404ec7c2315d8babf9c50bfe5a5ed0b38caf7396c9c11" Oct 01 15:31:19 crc kubenswrapper[4771]: I1001 15:31:19.494135 4771 generic.go:334] "Generic (PLEG): container finished" podID="9c50c132-15f0-45c7-a895-46fe2be6003e" containerID="2c8c1d9d12667d0060d98ef7b6b78ba3281eb54bb266356987e97449cf377cce" exitCode=0 Oct 01 15:31:19 crc kubenswrapper[4771]: I1001 15:31:19.494222 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" event={"ID":"9c50c132-15f0-45c7-a895-46fe2be6003e","Type":"ContainerDied","Data":"2c8c1d9d12667d0060d98ef7b6b78ba3281eb54bb266356987e97449cf377cce"} Oct 01 15:31:20 crc kubenswrapper[4771]: I1001 15:31:20.994758 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.139597 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-metadata-combined-ca-bundle\") pod \"9c50c132-15f0-45c7-a895-46fe2be6003e\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.139660 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-ssh-key\") pod \"9c50c132-15f0-45c7-a895-46fe2be6003e\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.139777 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6t96\" (UniqueName: \"kubernetes.io/projected/9c50c132-15f0-45c7-a895-46fe2be6003e-kube-api-access-g6t96\") pod \"9c50c132-15f0-45c7-a895-46fe2be6003e\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.139861 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-ovn-metadata-agent-neutron-config-0\") pod \"9c50c132-15f0-45c7-a895-46fe2be6003e\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.139959 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-nova-metadata-neutron-config-0\") pod \"9c50c132-15f0-45c7-a895-46fe2be6003e\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " Oct 01 
15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.140016 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-inventory\") pod \"9c50c132-15f0-45c7-a895-46fe2be6003e\" (UID: \"9c50c132-15f0-45c7-a895-46fe2be6003e\") " Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.146166 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "9c50c132-15f0-45c7-a895-46fe2be6003e" (UID: "9c50c132-15f0-45c7-a895-46fe2be6003e"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.147707 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c50c132-15f0-45c7-a895-46fe2be6003e-kube-api-access-g6t96" (OuterVolumeSpecName: "kube-api-access-g6t96") pod "9c50c132-15f0-45c7-a895-46fe2be6003e" (UID: "9c50c132-15f0-45c7-a895-46fe2be6003e"). InnerVolumeSpecName "kube-api-access-g6t96". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.178446 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "9c50c132-15f0-45c7-a895-46fe2be6003e" (UID: "9c50c132-15f0-45c7-a895-46fe2be6003e"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.183668 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-inventory" (OuterVolumeSpecName: "inventory") pod "9c50c132-15f0-45c7-a895-46fe2be6003e" (UID: "9c50c132-15f0-45c7-a895-46fe2be6003e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.185030 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "9c50c132-15f0-45c7-a895-46fe2be6003e" (UID: "9c50c132-15f0-45c7-a895-46fe2be6003e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.193231 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "9c50c132-15f0-45c7-a895-46fe2be6003e" (UID: "9c50c132-15f0-45c7-a895-46fe2be6003e"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.242651 4771 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.242687 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.242699 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.242713 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.242744 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6t96\" (UniqueName: \"kubernetes.io/projected/9c50c132-15f0-45c7-a895-46fe2be6003e-kube-api-access-g6t96\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.242759 4771 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/9c50c132-15f0-45c7-a895-46fe2be6003e-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.519089 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" 
event={"ID":"9c50c132-15f0-45c7-a895-46fe2be6003e","Type":"ContainerDied","Data":"231e9cff195b9196e80d984b597453fed022db484892b9af5b9f5582a50d42c1"} Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.519137 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="231e9cff195b9196e80d984b597453fed022db484892b9af5b9f5582a50d42c1" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.519231 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.630062 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k"] Oct 01 15:31:21 crc kubenswrapper[4771]: E1001 15:31:21.630683 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerName="extract-utilities" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.630716 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerName="extract-utilities" Oct 01 15:31:21 crc kubenswrapper[4771]: E1001 15:31:21.630759 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerName="registry-server" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.630773 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerName="registry-server" Oct 01 15:31:21 crc kubenswrapper[4771]: E1001 15:31:21.630800 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c50c132-15f0-45c7-a895-46fe2be6003e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.630813 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c50c132-15f0-45c7-a895-46fe2be6003e" 
containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 15:31:21 crc kubenswrapper[4771]: E1001 15:31:21.630842 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerName="extract-content" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.630852 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerName="extract-content" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.631125 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c50c132-15f0-45c7-a895-46fe2be6003e" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.631158 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6d40cbc-35fb-4d0d-a948-acd279363bd0" containerName="registry-server" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.632240 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.634520 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.634567 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.635130 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.635526 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.635851 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.637913 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k"] Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.655710 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.656118 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: 
\"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.656177 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.656223 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzsx9\" (UniqueName: \"kubernetes.io/projected/87149516-d807-4412-90a5-e127c03943e0-kube-api-access-lzsx9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.656412 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.758492 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.758941 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.759051 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzsx9\" (UniqueName: \"kubernetes.io/projected/87149516-d807-4412-90a5-e127c03943e0-kube-api-access-lzsx9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.759207 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.759337 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.762016 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.762695 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.764348 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.764964 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.776170 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzsx9\" (UniqueName: \"kubernetes.io/projected/87149516-d807-4412-90a5-e127c03943e0-kube-api-access-lzsx9\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:21 crc kubenswrapper[4771]: I1001 15:31:21.958788 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:31:22 crc kubenswrapper[4771]: I1001 15:31:22.478420 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k"] Oct 01 15:31:22 crc kubenswrapper[4771]: I1001 15:31:22.527623 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" event={"ID":"87149516-d807-4412-90a5-e127c03943e0","Type":"ContainerStarted","Data":"7a3896700b53674a1ac663079b788ef5dbabd376f9ff31cecacd9e847844e06d"} Oct 01 15:31:23 crc kubenswrapper[4771]: I1001 15:31:23.537048 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" event={"ID":"87149516-d807-4412-90a5-e127c03943e0","Type":"ContainerStarted","Data":"46936a8c9c0a1dd2d5bcf9d3f01dd378077ad9f4ac7906cf80b74a9ddb39a1c9"} Oct 01 15:31:23 crc kubenswrapper[4771]: I1001 15:31:23.558447 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" podStartSLOduration=1.824266682 podStartE2EDuration="2.558413136s" podCreationTimestamp="2025-10-01 15:31:21 +0000 UTC" firstStartedPulling="2025-10-01 15:31:22.483559917 +0000 UTC m=+2127.102735088" lastFinishedPulling="2025-10-01 15:31:23.217706371 +0000 UTC m=+2127.836881542" observedRunningTime="2025-10-01 15:31:23.551679881 +0000 UTC m=+2128.170855072" watchObservedRunningTime="2025-10-01 15:31:23.558413136 +0000 UTC m=+2128.177588327" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.214953 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cn6hb"] Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.222150 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.226495 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cn6hb"] Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.271440 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-catalog-content\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.271495 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-utilities\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.271598 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr9xj\" (UniqueName: \"kubernetes.io/projected/b308106b-8699-4b75-af0e-360e94bf6cac-kube-api-access-sr9xj\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.372758 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr9xj\" (UniqueName: \"kubernetes.io/projected/b308106b-8699-4b75-af0e-360e94bf6cac-kube-api-access-sr9xj\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.372834 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-catalog-content\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.372875 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-utilities\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.373265 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-utilities\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.373560 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-catalog-content\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.396803 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr9xj\" (UniqueName: \"kubernetes.io/projected/b308106b-8699-4b75-af0e-360e94bf6cac-kube-api-access-sr9xj\") pod \"community-operators-cn6hb\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:27 crc kubenswrapper[4771]: I1001 15:31:27.551425 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:28 crc kubenswrapper[4771]: I1001 15:31:28.067420 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cn6hb"] Oct 01 15:31:28 crc kubenswrapper[4771]: I1001 15:31:28.581726 4771 generic.go:334] "Generic (PLEG): container finished" podID="b308106b-8699-4b75-af0e-360e94bf6cac" containerID="6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642" exitCode=0 Oct 01 15:31:28 crc kubenswrapper[4771]: I1001 15:31:28.581819 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn6hb" event={"ID":"b308106b-8699-4b75-af0e-360e94bf6cac","Type":"ContainerDied","Data":"6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642"} Oct 01 15:31:28 crc kubenswrapper[4771]: I1001 15:31:28.582169 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn6hb" event={"ID":"b308106b-8699-4b75-af0e-360e94bf6cac","Type":"ContainerStarted","Data":"6e83601f28e0f0f5aab9e1198510ca5e0130725c55ffb29a495102f24fb66c07"} Oct 01 15:31:30 crc kubenswrapper[4771]: I1001 15:31:30.611672 4771 generic.go:334] "Generic (PLEG): container finished" podID="b308106b-8699-4b75-af0e-360e94bf6cac" containerID="55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809" exitCode=0 Oct 01 15:31:30 crc kubenswrapper[4771]: I1001 15:31:30.611763 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn6hb" event={"ID":"b308106b-8699-4b75-af0e-360e94bf6cac","Type":"ContainerDied","Data":"55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809"} Oct 01 15:31:31 crc kubenswrapper[4771]: I1001 15:31:31.624590 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn6hb" 
event={"ID":"b308106b-8699-4b75-af0e-360e94bf6cac","Type":"ContainerStarted","Data":"77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b"} Oct 01 15:31:31 crc kubenswrapper[4771]: I1001 15:31:31.654138 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cn6hb" podStartSLOduration=2.055568404 podStartE2EDuration="4.654103181s" podCreationTimestamp="2025-10-01 15:31:27 +0000 UTC" firstStartedPulling="2025-10-01 15:31:28.584494059 +0000 UTC m=+2133.203669230" lastFinishedPulling="2025-10-01 15:31:31.183028806 +0000 UTC m=+2135.802204007" observedRunningTime="2025-10-01 15:31:31.645723717 +0000 UTC m=+2136.264898898" watchObservedRunningTime="2025-10-01 15:31:31.654103181 +0000 UTC m=+2136.273278422" Oct 01 15:31:37 crc kubenswrapper[4771]: I1001 15:31:37.551824 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:37 crc kubenswrapper[4771]: I1001 15:31:37.552466 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:37 crc kubenswrapper[4771]: I1001 15:31:37.598868 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:37 crc kubenswrapper[4771]: I1001 15:31:37.736861 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:37 crc kubenswrapper[4771]: I1001 15:31:37.839916 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cn6hb"] Oct 01 15:31:39 crc kubenswrapper[4771]: I1001 15:31:39.723760 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cn6hb" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" containerName="registry-server" 
containerID="cri-o://77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b" gracePeriod=2 Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.208294 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.345221 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-utilities\") pod \"b308106b-8699-4b75-af0e-360e94bf6cac\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.345369 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr9xj\" (UniqueName: \"kubernetes.io/projected/b308106b-8699-4b75-af0e-360e94bf6cac-kube-api-access-sr9xj\") pod \"b308106b-8699-4b75-af0e-360e94bf6cac\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.345588 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-catalog-content\") pod \"b308106b-8699-4b75-af0e-360e94bf6cac\" (UID: \"b308106b-8699-4b75-af0e-360e94bf6cac\") " Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.346513 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-utilities" (OuterVolumeSpecName: "utilities") pod "b308106b-8699-4b75-af0e-360e94bf6cac" (UID: "b308106b-8699-4b75-af0e-360e94bf6cac"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.351942 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b308106b-8699-4b75-af0e-360e94bf6cac-kube-api-access-sr9xj" (OuterVolumeSpecName: "kube-api-access-sr9xj") pod "b308106b-8699-4b75-af0e-360e94bf6cac" (UID: "b308106b-8699-4b75-af0e-360e94bf6cac"). InnerVolumeSpecName "kube-api-access-sr9xj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.447750 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.447783 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr9xj\" (UniqueName: \"kubernetes.io/projected/b308106b-8699-4b75-af0e-360e94bf6cac-kube-api-access-sr9xj\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.561857 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b308106b-8699-4b75-af0e-360e94bf6cac" (UID: "b308106b-8699-4b75-af0e-360e94bf6cac"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.651842 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b308106b-8699-4b75-af0e-360e94bf6cac-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.735169 4771 generic.go:334] "Generic (PLEG): container finished" podID="b308106b-8699-4b75-af0e-360e94bf6cac" containerID="77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b" exitCode=0 Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.735219 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn6hb" event={"ID":"b308106b-8699-4b75-af0e-360e94bf6cac","Type":"ContainerDied","Data":"77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b"} Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.735257 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cn6hb" event={"ID":"b308106b-8699-4b75-af0e-360e94bf6cac","Type":"ContainerDied","Data":"6e83601f28e0f0f5aab9e1198510ca5e0130725c55ffb29a495102f24fb66c07"} Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.735281 4771 scope.go:117] "RemoveContainer" containerID="77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.735404 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cn6hb" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.773347 4771 scope.go:117] "RemoveContainer" containerID="55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.789665 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cn6hb"] Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.803070 4771 scope.go:117] "RemoveContainer" containerID="6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.804766 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cn6hb"] Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.853014 4771 scope.go:117] "RemoveContainer" containerID="77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b" Oct 01 15:31:40 crc kubenswrapper[4771]: E1001 15:31:40.853388 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b\": container with ID starting with 77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b not found: ID does not exist" containerID="77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.853427 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b"} err="failed to get container status \"77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b\": rpc error: code = NotFound desc = could not find container \"77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b\": container with ID starting with 77ec685c1f04103d05d0efa7e05ad96d9a3b651eef2579d1d71c38a9da61ea4b not 
found: ID does not exist" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.853453 4771 scope.go:117] "RemoveContainer" containerID="55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809" Oct 01 15:31:40 crc kubenswrapper[4771]: E1001 15:31:40.853890 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809\": container with ID starting with 55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809 not found: ID does not exist" containerID="55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.853926 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809"} err="failed to get container status \"55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809\": rpc error: code = NotFound desc = could not find container \"55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809\": container with ID starting with 55334e8d5cd9d926bf5b67f492ffad88017be79e6a8e88f6a53a3922365eb809 not found: ID does not exist" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.853946 4771 scope.go:117] "RemoveContainer" containerID="6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642" Oct 01 15:31:40 crc kubenswrapper[4771]: E1001 15:31:40.855327 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642\": container with ID starting with 6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642 not found: ID does not exist" containerID="6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642" Oct 01 15:31:40 crc kubenswrapper[4771]: I1001 15:31:40.855358 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642"} err="failed to get container status \"6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642\": rpc error: code = NotFound desc = could not find container \"6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642\": container with ID starting with 6446ab5c8d35c500ca24f1892e4a27ae0ffedc56359d0d52bda4012b7c795642 not found: ID does not exist" Oct 01 15:31:41 crc kubenswrapper[4771]: I1001 15:31:41.996477 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" path="/var/lib/kubelet/pods/b308106b-8699-4b75-af0e-360e94bf6cac/volumes" Oct 01 15:31:42 crc kubenswrapper[4771]: I1001 15:31:42.177504 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:31:42 crc kubenswrapper[4771]: I1001 15:31:42.177617 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:32:12 crc kubenswrapper[4771]: I1001 15:32:12.178106 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:32:12 crc kubenswrapper[4771]: I1001 15:32:12.178691 4771 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.428644 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d67pt"] Oct 01 15:32:21 crc kubenswrapper[4771]: E1001 15:32:21.429883 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" containerName="registry-server" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.429898 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" containerName="registry-server" Oct 01 15:32:21 crc kubenswrapper[4771]: E1001 15:32:21.429910 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" containerName="extract-utilities" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.429918 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" containerName="extract-utilities" Oct 01 15:32:21 crc kubenswrapper[4771]: E1001 15:32:21.429933 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" containerName="extract-content" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.429939 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" containerName="extract-content" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.430118 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b308106b-8699-4b75-af0e-360e94bf6cac" containerName="registry-server" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.431416 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.447189 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d67pt"] Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.624159 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kzkl\" (UniqueName: \"kubernetes.io/projected/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-kube-api-access-9kzkl\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.624838 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-utilities\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.625091 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-catalog-content\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.727586 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kzkl\" (UniqueName: \"kubernetes.io/projected/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-kube-api-access-9kzkl\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.728048 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-utilities\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.728260 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-catalog-content\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.728614 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-utilities\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.728819 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-catalog-content\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.754435 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kzkl\" (UniqueName: \"kubernetes.io/projected/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-kube-api-access-9kzkl\") pod \"redhat-operators-d67pt\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:21 crc kubenswrapper[4771]: I1001 15:32:21.762383 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:22 crc kubenswrapper[4771]: I1001 15:32:22.247858 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d67pt"] Oct 01 15:32:23 crc kubenswrapper[4771]: I1001 15:32:23.246231 4771 generic.go:334] "Generic (PLEG): container finished" podID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerID="15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450" exitCode=0 Oct 01 15:32:23 crc kubenswrapper[4771]: I1001 15:32:23.246392 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d67pt" event={"ID":"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13","Type":"ContainerDied","Data":"15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450"} Oct 01 15:32:23 crc kubenswrapper[4771]: I1001 15:32:23.246535 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d67pt" event={"ID":"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13","Type":"ContainerStarted","Data":"17b07962f02855b9941a8986302bd582e591b0d4a98ea9484688c580add13ae9"} Oct 01 15:32:24 crc kubenswrapper[4771]: I1001 15:32:24.260775 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d67pt" event={"ID":"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13","Type":"ContainerStarted","Data":"19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460"} Oct 01 15:32:25 crc kubenswrapper[4771]: I1001 15:32:25.273480 4771 generic.go:334] "Generic (PLEG): container finished" podID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerID="19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460" exitCode=0 Oct 01 15:32:25 crc kubenswrapper[4771]: I1001 15:32:25.273540 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d67pt" 
event={"ID":"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13","Type":"ContainerDied","Data":"19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460"} Oct 01 15:32:27 crc kubenswrapper[4771]: I1001 15:32:27.302122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d67pt" event={"ID":"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13","Type":"ContainerStarted","Data":"cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724"} Oct 01 15:32:27 crc kubenswrapper[4771]: I1001 15:32:27.327835 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d67pt" podStartSLOduration=3.803354465 podStartE2EDuration="6.327816915s" podCreationTimestamp="2025-10-01 15:32:21 +0000 UTC" firstStartedPulling="2025-10-01 15:32:23.249631192 +0000 UTC m=+2187.868806383" lastFinishedPulling="2025-10-01 15:32:25.774093662 +0000 UTC m=+2190.393268833" observedRunningTime="2025-10-01 15:32:27.320987339 +0000 UTC m=+2191.940162520" watchObservedRunningTime="2025-10-01 15:32:27.327816915 +0000 UTC m=+2191.946992106" Oct 01 15:32:31 crc kubenswrapper[4771]: I1001 15:32:31.763471 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:31 crc kubenswrapper[4771]: I1001 15:32:31.763986 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:31 crc kubenswrapper[4771]: I1001 15:32:31.809270 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:32 crc kubenswrapper[4771]: I1001 15:32:32.422465 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:32 crc kubenswrapper[4771]: I1001 15:32:32.494359 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-d67pt"] Oct 01 15:32:34 crc kubenswrapper[4771]: I1001 15:32:34.373100 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d67pt" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerName="registry-server" containerID="cri-o://cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724" gracePeriod=2 Oct 01 15:32:34 crc kubenswrapper[4771]: I1001 15:32:34.850343 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.036108 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kzkl\" (UniqueName: \"kubernetes.io/projected/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-kube-api-access-9kzkl\") pod \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.036191 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-utilities\") pod \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.036296 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-catalog-content\") pod \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\" (UID: \"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13\") " Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.037424 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-utilities" (OuterVolumeSpecName: "utilities") pod "e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" (UID: 
"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.051271 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-kube-api-access-9kzkl" (OuterVolumeSpecName: "kube-api-access-9kzkl") pod "e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" (UID: "e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13"). InnerVolumeSpecName "kube-api-access-9kzkl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.138432 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" (UID: "e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.139082 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.139100 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kzkl\" (UniqueName: \"kubernetes.io/projected/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-kube-api-access-9kzkl\") on node \"crc\" DevicePath \"\"" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.139115 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.390827 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerID="cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724" exitCode=0 Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.390902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d67pt" event={"ID":"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13","Type":"ContainerDied","Data":"cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724"} Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.392885 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d67pt" event={"ID":"e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13","Type":"ContainerDied","Data":"17b07962f02855b9941a8986302bd582e591b0d4a98ea9484688c580add13ae9"} Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.392930 4771 scope.go:117] "RemoveContainer" containerID="cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.390947 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d67pt" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.430038 4771 scope.go:117] "RemoveContainer" containerID="19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.441158 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d67pt"] Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.451671 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d67pt"] Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.459141 4771 scope.go:117] "RemoveContainer" containerID="15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.519406 4771 scope.go:117] "RemoveContainer" containerID="cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724" Oct 01 15:32:35 crc kubenswrapper[4771]: E1001 15:32:35.519804 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724\": container with ID starting with cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724 not found: ID does not exist" containerID="cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.519842 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724"} err="failed to get container status \"cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724\": rpc error: code = NotFound desc = could not find container \"cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724\": container with ID starting with cd3285aa6e62c60e936f135037f9de5ddf1e980ec44a16d70b2f5d577eb3c724 not found: ID does 
not exist" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.519875 4771 scope.go:117] "RemoveContainer" containerID="19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460" Oct 01 15:32:35 crc kubenswrapper[4771]: E1001 15:32:35.520623 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460\": container with ID starting with 19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460 not found: ID does not exist" containerID="19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.520861 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460"} err="failed to get container status \"19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460\": rpc error: code = NotFound desc = could not find container \"19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460\": container with ID starting with 19f800abc1a48f99a108478415d75605730c3c0a2d15b99ac2f01a9d6930b460 not found: ID does not exist" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.521031 4771 scope.go:117] "RemoveContainer" containerID="15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450" Oct 01 15:32:35 crc kubenswrapper[4771]: E1001 15:32:35.521631 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450\": container with ID starting with 15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450 not found: ID does not exist" containerID="15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450" Oct 01 15:32:35 crc kubenswrapper[4771]: I1001 15:32:35.521659 4771 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450"} err="failed to get container status \"15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450\": rpc error: code = NotFound desc = could not find container \"15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450\": container with ID starting with 15a8369d781591a4b917d4794db1030f0034db0f07830bfd1d8cf247327c2450 not found: ID does not exist" Oct 01 15:32:36 crc kubenswrapper[4771]: I1001 15:32:36.005710 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" path="/var/lib/kubelet/pods/e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13/volumes" Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.177932 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.178809 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.178884 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.180097 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8"} 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.180210 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" gracePeriod=600 Oct 01 15:32:42 crc kubenswrapper[4771]: E1001 15:32:42.302629 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.460270 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" exitCode=0 Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.460345 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8"} Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.460961 4771 scope.go:117] "RemoveContainer" containerID="9a6a9c0077062d17e78226f49233ded0b62ef8f37ee11aa7036368c11d9a3cf2" Oct 01 15:32:42 crc kubenswrapper[4771]: I1001 15:32:42.461533 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 
01 15:32:42 crc kubenswrapper[4771]: E1001 15:32:42.461811 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:32:56 crc kubenswrapper[4771]: I1001 15:32:56.985586 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:32:56 crc kubenswrapper[4771]: E1001 15:32:56.986674 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:33:08 crc kubenswrapper[4771]: I1001 15:33:08.985564 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:33:08 crc kubenswrapper[4771]: E1001 15:33:08.986408 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:33:23 crc kubenswrapper[4771]: I1001 15:33:23.985429 4771 scope.go:117] "RemoveContainer" 
containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:33:23 crc kubenswrapper[4771]: E1001 15:33:23.986443 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:33:37 crc kubenswrapper[4771]: I1001 15:33:37.984940 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:33:37 crc kubenswrapper[4771]: E1001 15:33:37.989269 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:33:52 crc kubenswrapper[4771]: I1001 15:33:52.986155 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:33:52 crc kubenswrapper[4771]: E1001 15:33:52.986821 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:34:04 crc kubenswrapper[4771]: I1001 15:34:04.986438 4771 scope.go:117] 
"RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:34:04 crc kubenswrapper[4771]: E1001 15:34:04.987465 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:34:16 crc kubenswrapper[4771]: I1001 15:34:16.989747 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:34:16 crc kubenswrapper[4771]: E1001 15:34:16.991057 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:34:30 crc kubenswrapper[4771]: I1001 15:34:30.985590 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:34:30 crc kubenswrapper[4771]: E1001 15:34:30.986601 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:34:44 crc kubenswrapper[4771]: I1001 15:34:44.985944 
4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:34:44 crc kubenswrapper[4771]: E1001 15:34:44.986894 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:34:59 crc kubenswrapper[4771]: I1001 15:34:59.986098 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:34:59 crc kubenswrapper[4771]: E1001 15:34:59.986933 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:35:11 crc kubenswrapper[4771]: I1001 15:35:11.987550 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:35:11 crc kubenswrapper[4771]: E1001 15:35:11.988355 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:35:24 crc kubenswrapper[4771]: I1001 
15:35:24.985537 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:35:24 crc kubenswrapper[4771]: E1001 15:35:24.986338 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:35:39 crc kubenswrapper[4771]: I1001 15:35:39.985820 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:35:39 crc kubenswrapper[4771]: E1001 15:35:39.986934 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:35:44 crc kubenswrapper[4771]: I1001 15:35:44.320747 4771 generic.go:334] "Generic (PLEG): container finished" podID="87149516-d807-4412-90a5-e127c03943e0" containerID="46936a8c9c0a1dd2d5bcf9d3f01dd378077ad9f4ac7906cf80b74a9ddb39a1c9" exitCode=0 Oct 01 15:35:44 crc kubenswrapper[4771]: I1001 15:35:44.320889 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" event={"ID":"87149516-d807-4412-90a5-e127c03943e0","Type":"ContainerDied","Data":"46936a8c9c0a1dd2d5bcf9d3f01dd378077ad9f4ac7906cf80b74a9ddb39a1c9"} Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.743882 4771 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.878957 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-combined-ca-bundle\") pod \"87149516-d807-4412-90a5-e127c03943e0\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.879307 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-ssh-key\") pod \"87149516-d807-4412-90a5-e127c03943e0\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.879371 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-secret-0\") pod \"87149516-d807-4412-90a5-e127c03943e0\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.879473 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-inventory\") pod \"87149516-d807-4412-90a5-e127c03943e0\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.879615 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzsx9\" (UniqueName: \"kubernetes.io/projected/87149516-d807-4412-90a5-e127c03943e0-kube-api-access-lzsx9\") pod \"87149516-d807-4412-90a5-e127c03943e0\" (UID: \"87149516-d807-4412-90a5-e127c03943e0\") " Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.885081 4771 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/87149516-d807-4412-90a5-e127c03943e0-kube-api-access-lzsx9" (OuterVolumeSpecName: "kube-api-access-lzsx9") pod "87149516-d807-4412-90a5-e127c03943e0" (UID: "87149516-d807-4412-90a5-e127c03943e0"). InnerVolumeSpecName "kube-api-access-lzsx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.892242 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "87149516-d807-4412-90a5-e127c03943e0" (UID: "87149516-d807-4412-90a5-e127c03943e0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.911913 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "87149516-d807-4412-90a5-e127c03943e0" (UID: "87149516-d807-4412-90a5-e127c03943e0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.914600 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-inventory" (OuterVolumeSpecName: "inventory") pod "87149516-d807-4412-90a5-e127c03943e0" (UID: "87149516-d807-4412-90a5-e127c03943e0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.919444 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "87149516-d807-4412-90a5-e127c03943e0" (UID: "87149516-d807-4412-90a5-e127c03943e0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.982951 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzsx9\" (UniqueName: \"kubernetes.io/projected/87149516-d807-4412-90a5-e127c03943e0-kube-api-access-lzsx9\") on node \"crc\" DevicePath \"\"" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.983019 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.983045 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.983075 4771 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:35:45 crc kubenswrapper[4771]: I1001 15:35:45.983098 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/87149516-d807-4412-90a5-e127c03943e0-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.342399 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" 
event={"ID":"87149516-d807-4412-90a5-e127c03943e0","Type":"ContainerDied","Data":"7a3896700b53674a1ac663079b788ef5dbabd376f9ff31cecacd9e847844e06d"} Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.342450 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a3896700b53674a1ac663079b788ef5dbabd376f9ff31cecacd9e847844e06d" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.342543 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.466811 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd"] Oct 01 15:35:46 crc kubenswrapper[4771]: E1001 15:35:46.467288 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87149516-d807-4412-90a5-e127c03943e0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.467309 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="87149516-d807-4412-90a5-e127c03943e0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 15:35:46 crc kubenswrapper[4771]: E1001 15:35:46.467321 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerName="extract-utilities" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.467330 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerName="extract-utilities" Oct 01 15:35:46 crc kubenswrapper[4771]: E1001 15:35:46.467348 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerName="extract-content" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.467356 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" 
containerName="extract-content" Oct 01 15:35:46 crc kubenswrapper[4771]: E1001 15:35:46.467378 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerName="registry-server" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.467386 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerName="registry-server" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.467611 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5e986cb-80e0-4a98-9d5d-7d6fcb7aec13" containerName="registry-server" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.467635 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="87149516-d807-4412-90a5-e127c03943e0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.468347 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.472778 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.472821 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.472874 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.472997 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.475335 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.475653 4771 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.477484 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.485697 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd"] Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.595330 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.595405 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.595437 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.595575 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.595632 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.595844 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.595899 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.596082 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwtvv\" (UniqueName: \"kubernetes.io/projected/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-kube-api-access-hwtvv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: 
\"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.596158 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.697806 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.697852 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.697912 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.697942 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.697998 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwtvv\" (UniqueName: \"kubernetes.io/projected/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-kube-api-access-hwtvv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.698027 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.698088 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.698109 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.698128 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.698783 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.701387 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.701505 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.702570 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.702981 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.703311 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.704047 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.717637 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwtvv\" (UniqueName: \"kubernetes.io/projected/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-kube-api-access-hwtvv\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc 
kubenswrapper[4771]: I1001 15:35:46.721019 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lbdsd\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:46 crc kubenswrapper[4771]: I1001 15:35:46.790535 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:35:47 crc kubenswrapper[4771]: I1001 15:35:47.329180 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:35:47 crc kubenswrapper[4771]: I1001 15:35:47.346873 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd"] Oct 01 15:35:47 crc kubenswrapper[4771]: I1001 15:35:47.353519 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" event={"ID":"ea0299a3-63d8-41e7-a23d-4ddd7491df9c","Type":"ContainerStarted","Data":"01d978eb9483c7ff24a1f10d4449e9c9dc4eed0722a900d6547377e9dc0add23"} Oct 01 15:35:48 crc kubenswrapper[4771]: I1001 15:35:48.366924 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" event={"ID":"ea0299a3-63d8-41e7-a23d-4ddd7491df9c","Type":"ContainerStarted","Data":"8fcf2be7831417f95facfa3c38844b2c3de1f920f8588db017e00c7c03e893b9"} Oct 01 15:35:48 crc kubenswrapper[4771]: I1001 15:35:48.393442 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" podStartSLOduration=1.929943325 podStartE2EDuration="2.393415897s" podCreationTimestamp="2025-10-01 15:35:46 +0000 UTC" firstStartedPulling="2025-10-01 15:35:47.3288896 +0000 UTC m=+2391.948064771" 
lastFinishedPulling="2025-10-01 15:35:47.792362172 +0000 UTC m=+2392.411537343" observedRunningTime="2025-10-01 15:35:48.385158416 +0000 UTC m=+2393.004333627" watchObservedRunningTime="2025-10-01 15:35:48.393415897 +0000 UTC m=+2393.012591108" Oct 01 15:35:54 crc kubenswrapper[4771]: I1001 15:35:54.985293 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:35:54 crc kubenswrapper[4771]: E1001 15:35:54.986154 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:36:06 crc kubenswrapper[4771]: I1001 15:36:06.985497 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:36:06 crc kubenswrapper[4771]: E1001 15:36:06.986465 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:36:19 crc kubenswrapper[4771]: I1001 15:36:19.986209 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:36:19 crc kubenswrapper[4771]: E1001 15:36:19.987559 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:36:32 crc kubenswrapper[4771]: I1001 15:36:32.986061 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:36:32 crc kubenswrapper[4771]: E1001 15:36:32.988519 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:36:46 crc kubenswrapper[4771]: I1001 15:36:46.986236 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:36:46 crc kubenswrapper[4771]: E1001 15:36:46.987277 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:36:58 crc kubenswrapper[4771]: I1001 15:36:58.985220 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:36:58 crc kubenswrapper[4771]: E1001 15:36:58.986498 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:37:09 crc kubenswrapper[4771]: I1001 15:37:09.985002 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:37:09 crc kubenswrapper[4771]: E1001 15:37:09.985615 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:37:22 crc kubenswrapper[4771]: I1001 15:37:22.985471 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:37:22 crc kubenswrapper[4771]: E1001 15:37:22.986676 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:37:37 crc kubenswrapper[4771]: I1001 15:37:37.985717 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:37:37 crc kubenswrapper[4771]: E1001 15:37:37.986791 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:37:48 crc kubenswrapper[4771]: I1001 15:37:48.986887 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:37:49 crc kubenswrapper[4771]: I1001 15:37:49.665788 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"0d782d6a5f89be0ead3cef9790ea83544120df7e378bb400716acc9b592f88b8"} Oct 01 15:39:18 crc kubenswrapper[4771]: I1001 15:39:18.616015 4771 generic.go:334] "Generic (PLEG): container finished" podID="ea0299a3-63d8-41e7-a23d-4ddd7491df9c" containerID="8fcf2be7831417f95facfa3c38844b2c3de1f920f8588db017e00c7c03e893b9" exitCode=0 Oct 01 15:39:18 crc kubenswrapper[4771]: I1001 15:39:18.616115 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" event={"ID":"ea0299a3-63d8-41e7-a23d-4ddd7491df9c","Type":"ContainerDied","Data":"8fcf2be7831417f95facfa3c38844b2c3de1f920f8588db017e00c7c03e893b9"} Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.109749 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.206899 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwtvv\" (UniqueName: \"kubernetes.io/projected/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-kube-api-access-hwtvv\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.206991 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-inventory\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.207096 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-1\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.207135 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-1\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.207191 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-0\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.207262 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-combined-ca-bundle\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.207322 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-extra-config-0\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.207354 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-ssh-key\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.207476 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-0\") pod \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\" (UID: \"ea0299a3-63d8-41e7-a23d-4ddd7491df9c\") " Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.221583 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.221644 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-kube-api-access-hwtvv" (OuterVolumeSpecName: "kube-api-access-hwtvv") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "kube-api-access-hwtvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.232693 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.241673 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.241852 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-inventory" (OuterVolumeSpecName: "inventory") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.244647 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.249138 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.253754 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.259608 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "ea0299a3-63d8-41e7-a23d-4ddd7491df9c" (UID: "ea0299a3-63d8-41e7-a23d-4ddd7491df9c"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309702 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309819 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwtvv\" (UniqueName: \"kubernetes.io/projected/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-kube-api-access-hwtvv\") on node \"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309837 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309849 4771 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309862 4771 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309875 4771 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309890 4771 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309902 4771 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.309914 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ea0299a3-63d8-41e7-a23d-4ddd7491df9c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.651716 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" event={"ID":"ea0299a3-63d8-41e7-a23d-4ddd7491df9c","Type":"ContainerDied","Data":"01d978eb9483c7ff24a1f10d4449e9c9dc4eed0722a900d6547377e9dc0add23"} Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.651784 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01d978eb9483c7ff24a1f10d4449e9c9dc4eed0722a900d6547377e9dc0add23" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.651891 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lbdsd" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.752951 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6"] Oct 01 15:39:20 crc kubenswrapper[4771]: E1001 15:39:20.753511 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea0299a3-63d8-41e7-a23d-4ddd7491df9c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.753615 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea0299a3-63d8-41e7-a23d-4ddd7491df9c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.753949 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea0299a3-63d8-41e7-a23d-4ddd7491df9c" containerName="nova-edpm-deployment-openstack-edpm-ipam" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.754781 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.756884 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.757178 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-fv9b7" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.757398 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.757441 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.757337 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.768319 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6"] Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.827326 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.827583 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: 
\"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.827634 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.827685 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.827777 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.827833 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.827928 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2qns\" (UniqueName: \"kubernetes.io/projected/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-kube-api-access-d2qns\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.929893 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.929958 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.930018 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2qns\" (UniqueName: \"kubernetes.io/projected/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-kube-api-access-d2qns\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.930078 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.930263 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.930291 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.930318 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.935984 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-2\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.937870 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.939533 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.939676 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.949397 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.949489 4771 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:20 crc kubenswrapper[4771]: I1001 15:39:20.957598 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2qns\" (UniqueName: \"kubernetes.io/projected/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-kube-api-access-d2qns\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:21 crc kubenswrapper[4771]: I1001 15:39:21.139631 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:39:21 crc kubenswrapper[4771]: I1001 15:39:21.652304 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6"] Oct 01 15:39:21 crc kubenswrapper[4771]: W1001 15:39:21.655507 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb938863c_4e4f_414a_9b0b_2d2583d9ae0c.slice/crio-93d90989e85ee000fe11a13316ddc339c70d03da206489282a7d4c1240ffd1ae WatchSource:0}: Error finding container 93d90989e85ee000fe11a13316ddc339c70d03da206489282a7d4c1240ffd1ae: Status 404 returned error can't find the container with id 93d90989e85ee000fe11a13316ddc339c70d03da206489282a7d4c1240ffd1ae Oct 01 15:39:22 crc kubenswrapper[4771]: I1001 15:39:22.674643 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" 
event={"ID":"b938863c-4e4f-414a-9b0b-2d2583d9ae0c","Type":"ContainerStarted","Data":"965ed52cfa015a7187563c13643581155aa21ec6fbedf3c00ee8de7838d8b47a"} Oct 01 15:39:22 crc kubenswrapper[4771]: I1001 15:39:22.674956 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" event={"ID":"b938863c-4e4f-414a-9b0b-2d2583d9ae0c","Type":"ContainerStarted","Data":"93d90989e85ee000fe11a13316ddc339c70d03da206489282a7d4c1240ffd1ae"} Oct 01 15:39:22 crc kubenswrapper[4771]: I1001 15:39:22.713450 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" podStartSLOduration=2.172328248 podStartE2EDuration="2.71341503s" podCreationTimestamp="2025-10-01 15:39:20 +0000 UTC" firstStartedPulling="2025-10-01 15:39:21.657682591 +0000 UTC m=+2606.276857772" lastFinishedPulling="2025-10-01 15:39:22.198769353 +0000 UTC m=+2606.817944554" observedRunningTime="2025-10-01 15:39:22.703423878 +0000 UTC m=+2607.322599109" watchObservedRunningTime="2025-10-01 15:39:22.71341503 +0000 UTC m=+2607.332590241" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.814480 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hfkg4"] Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.819428 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.836718 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hfkg4"] Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.890072 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-utilities\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.890172 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v29vz\" (UniqueName: \"kubernetes.io/projected/e9817ec8-e004-46a6-a963-a69abdda0fc6-kube-api-access-v29vz\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.890380 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-catalog-content\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.992444 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-utilities\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.992832 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-v29vz\" (UniqueName: \"kubernetes.io/projected/e9817ec8-e004-46a6-a963-a69abdda0fc6-kube-api-access-v29vz\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.992892 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-catalog-content\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.992912 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-utilities\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:08 crc kubenswrapper[4771]: I1001 15:40:08.993341 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-catalog-content\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:09 crc kubenswrapper[4771]: I1001 15:40:09.020064 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v29vz\" (UniqueName: \"kubernetes.io/projected/e9817ec8-e004-46a6-a963-a69abdda0fc6-kube-api-access-v29vz\") pod \"certified-operators-hfkg4\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:09 crc kubenswrapper[4771]: I1001 15:40:09.150661 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:09 crc kubenswrapper[4771]: I1001 15:40:09.711276 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hfkg4"] Oct 01 15:40:10 crc kubenswrapper[4771]: I1001 15:40:10.184631 4771 generic.go:334] "Generic (PLEG): container finished" podID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerID="a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5" exitCode=0 Oct 01 15:40:10 crc kubenswrapper[4771]: I1001 15:40:10.184694 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkg4" event={"ID":"e9817ec8-e004-46a6-a963-a69abdda0fc6","Type":"ContainerDied","Data":"a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5"} Oct 01 15:40:10 crc kubenswrapper[4771]: I1001 15:40:10.185122 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkg4" event={"ID":"e9817ec8-e004-46a6-a963-a69abdda0fc6","Type":"ContainerStarted","Data":"aade9c945eec57c52bb9d48c64486cb9fd73dca77fd81cea60f674f8befe54ab"} Oct 01 15:40:12 crc kubenswrapper[4771]: I1001 15:40:12.176862 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:40:12 crc kubenswrapper[4771]: I1001 15:40:12.178116 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:40:12 crc kubenswrapper[4771]: I1001 15:40:12.206177 4771 generic.go:334] "Generic 
(PLEG): container finished" podID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerID="123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d" exitCode=0 Oct 01 15:40:12 crc kubenswrapper[4771]: I1001 15:40:12.206236 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkg4" event={"ID":"e9817ec8-e004-46a6-a963-a69abdda0fc6","Type":"ContainerDied","Data":"123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d"} Oct 01 15:40:13 crc kubenswrapper[4771]: I1001 15:40:13.225616 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkg4" event={"ID":"e9817ec8-e004-46a6-a963-a69abdda0fc6","Type":"ContainerStarted","Data":"e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715"} Oct 01 15:40:13 crc kubenswrapper[4771]: I1001 15:40:13.254980 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hfkg4" podStartSLOduration=2.786154738 podStartE2EDuration="5.254958106s" podCreationTimestamp="2025-10-01 15:40:08 +0000 UTC" firstStartedPulling="2025-10-01 15:40:10.186604131 +0000 UTC m=+2654.805779342" lastFinishedPulling="2025-10-01 15:40:12.655407549 +0000 UTC m=+2657.274582710" observedRunningTime="2025-10-01 15:40:13.248849172 +0000 UTC m=+2657.868024353" watchObservedRunningTime="2025-10-01 15:40:13.254958106 +0000 UTC m=+2657.874133287" Oct 01 15:40:19 crc kubenswrapper[4771]: I1001 15:40:19.151489 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:19 crc kubenswrapper[4771]: I1001 15:40:19.152151 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:19 crc kubenswrapper[4771]: I1001 15:40:19.204437 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:19 crc kubenswrapper[4771]: I1001 15:40:19.356524 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:19 crc kubenswrapper[4771]: I1001 15:40:19.449768 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hfkg4"] Oct 01 15:40:21 crc kubenswrapper[4771]: I1001 15:40:21.327656 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hfkg4" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerName="registry-server" containerID="cri-o://e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715" gracePeriod=2 Oct 01 15:40:21 crc kubenswrapper[4771]: I1001 15:40:21.902465 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:21 crc kubenswrapper[4771]: I1001 15:40:21.991641 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-catalog-content\") pod \"e9817ec8-e004-46a6-a963-a69abdda0fc6\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " Oct 01 15:40:21 crc kubenswrapper[4771]: I1001 15:40:21.991692 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v29vz\" (UniqueName: \"kubernetes.io/projected/e9817ec8-e004-46a6-a963-a69abdda0fc6-kube-api-access-v29vz\") pod \"e9817ec8-e004-46a6-a963-a69abdda0fc6\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " Oct 01 15:40:21 crc kubenswrapper[4771]: I1001 15:40:21.991778 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-utilities\") pod 
\"e9817ec8-e004-46a6-a963-a69abdda0fc6\" (UID: \"e9817ec8-e004-46a6-a963-a69abdda0fc6\") " Oct 01 15:40:21 crc kubenswrapper[4771]: I1001 15:40:21.992542 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-utilities" (OuterVolumeSpecName: "utilities") pod "e9817ec8-e004-46a6-a963-a69abdda0fc6" (UID: "e9817ec8-e004-46a6-a963-a69abdda0fc6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:40:21 crc kubenswrapper[4771]: I1001 15:40:21.999522 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9817ec8-e004-46a6-a963-a69abdda0fc6-kube-api-access-v29vz" (OuterVolumeSpecName: "kube-api-access-v29vz") pod "e9817ec8-e004-46a6-a963-a69abdda0fc6" (UID: "e9817ec8-e004-46a6-a963-a69abdda0fc6"). InnerVolumeSpecName "kube-api-access-v29vz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.032524 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e9817ec8-e004-46a6-a963-a69abdda0fc6" (UID: "e9817ec8-e004-46a6-a963-a69abdda0fc6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.094542 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.094595 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v29vz\" (UniqueName: \"kubernetes.io/projected/e9817ec8-e004-46a6-a963-a69abdda0fc6-kube-api-access-v29vz\") on node \"crc\" DevicePath \"\"" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.094616 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e9817ec8-e004-46a6-a963-a69abdda0fc6-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.341772 4771 generic.go:334] "Generic (PLEG): container finished" podID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerID="e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715" exitCode=0 Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.341831 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkg4" event={"ID":"e9817ec8-e004-46a6-a963-a69abdda0fc6","Type":"ContainerDied","Data":"e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715"} Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.341879 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hfkg4" event={"ID":"e9817ec8-e004-46a6-a963-a69abdda0fc6","Type":"ContainerDied","Data":"aade9c945eec57c52bb9d48c64486cb9fd73dca77fd81cea60f674f8befe54ab"} Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.341911 4771 scope.go:117] "RemoveContainer" containerID="e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 
15:40:22.341990 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hfkg4" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.384329 4771 scope.go:117] "RemoveContainer" containerID="123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.393713 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hfkg4"] Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.415096 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hfkg4"] Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.430853 4771 scope.go:117] "RemoveContainer" containerID="a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.491651 4771 scope.go:117] "RemoveContainer" containerID="e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715" Oct 01 15:40:22 crc kubenswrapper[4771]: E1001 15:40:22.492494 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715\": container with ID starting with e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715 not found: ID does not exist" containerID="e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.492614 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715"} err="failed to get container status \"e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715\": rpc error: code = NotFound desc = could not find container \"e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715\": container with ID starting with 
e00725827e14ca0ebc2d01fa6818c0300c39f203668d069dbd1c8a3516041715 not found: ID does not exist" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.492715 4771 scope.go:117] "RemoveContainer" containerID="123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d" Oct 01 15:40:22 crc kubenswrapper[4771]: E1001 15:40:22.493520 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d\": container with ID starting with 123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d not found: ID does not exist" containerID="123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.493574 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d"} err="failed to get container status \"123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d\": rpc error: code = NotFound desc = could not find container \"123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d\": container with ID starting with 123a2b4389895edff70b31d8f8da6a944e683326bb847a696ca3bf0655432f0d not found: ID does not exist" Oct 01 15:40:22 crc kubenswrapper[4771]: I1001 15:40:22.493610 4771 scope.go:117] "RemoveContainer" containerID="a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5" Oct 01 15:40:22 crc kubenswrapper[4771]: E1001 15:40:22.494254 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5\": container with ID starting with a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5 not found: ID does not exist" containerID="a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5" Oct 01 15:40:22 crc 
kubenswrapper[4771]: I1001 15:40:22.494306 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5"} err="failed to get container status \"a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5\": rpc error: code = NotFound desc = could not find container \"a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5\": container with ID starting with a5dce560e396d549393102f1633dc007b73a5e74c2d9e599695d19d47532dac5 not found: ID does not exist" Oct 01 15:40:24 crc kubenswrapper[4771]: I1001 15:40:24.004227 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" path="/var/lib/kubelet/pods/e9817ec8-e004-46a6-a963-a69abdda0fc6/volumes" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.177592 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.178301 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.258866 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w9lkt"] Oct 01 15:40:42 crc kubenswrapper[4771]: E1001 15:40:42.259335 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerName="registry-server" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 
15:40:42.259356 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerName="registry-server" Oct 01 15:40:42 crc kubenswrapper[4771]: E1001 15:40:42.259382 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerName="extract-utilities" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.259390 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerName="extract-utilities" Oct 01 15:40:42 crc kubenswrapper[4771]: E1001 15:40:42.259409 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerName="extract-content" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.259418 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerName="extract-content" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.259682 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9817ec8-e004-46a6-a963-a69abdda0fc6" containerName="registry-server" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.263917 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.284216 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9lkt"] Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.356171 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-catalog-content\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.356402 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nc5h\" (UniqueName: \"kubernetes.io/projected/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-kube-api-access-5nc5h\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.356726 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-utilities\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.458780 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-utilities\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.458874 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-catalog-content\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.458935 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nc5h\" (UniqueName: \"kubernetes.io/projected/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-kube-api-access-5nc5h\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.459486 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-utilities\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.459578 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-catalog-content\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.482969 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nc5h\" (UniqueName: \"kubernetes.io/projected/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-kube-api-access-5nc5h\") pod \"redhat-marketplace-w9lkt\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:42 crc kubenswrapper[4771]: I1001 15:40:42.600085 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:43 crc kubenswrapper[4771]: I1001 15:40:43.062535 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9lkt"] Oct 01 15:40:43 crc kubenswrapper[4771]: W1001 15:40:43.070404 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaa6c1cb_f03a_4d56_8953_244f08b4c4c8.slice/crio-0dec2b495399d07b3e777f6248e253fa48528e55634956c50a308f8d68c2b77b WatchSource:0}: Error finding container 0dec2b495399d07b3e777f6248e253fa48528e55634956c50a308f8d68c2b77b: Status 404 returned error can't find the container with id 0dec2b495399d07b3e777f6248e253fa48528e55634956c50a308f8d68c2b77b Oct 01 15:40:43 crc kubenswrapper[4771]: I1001 15:40:43.585530 4771 generic.go:334] "Generic (PLEG): container finished" podID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerID="21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4" exitCode=0 Oct 01 15:40:43 crc kubenswrapper[4771]: I1001 15:40:43.585653 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9lkt" event={"ID":"daa6c1cb-f03a-4d56-8953-244f08b4c4c8","Type":"ContainerDied","Data":"21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4"} Oct 01 15:40:43 crc kubenswrapper[4771]: I1001 15:40:43.585710 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9lkt" event={"ID":"daa6c1cb-f03a-4d56-8953-244f08b4c4c8","Type":"ContainerStarted","Data":"0dec2b495399d07b3e777f6248e253fa48528e55634956c50a308f8d68c2b77b"} Oct 01 15:40:44 crc kubenswrapper[4771]: I1001 15:40:44.601219 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9lkt" 
event={"ID":"daa6c1cb-f03a-4d56-8953-244f08b4c4c8","Type":"ContainerStarted","Data":"dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7"} Oct 01 15:40:45 crc kubenswrapper[4771]: I1001 15:40:45.616812 4771 generic.go:334] "Generic (PLEG): container finished" podID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerID="dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7" exitCode=0 Oct 01 15:40:45 crc kubenswrapper[4771]: I1001 15:40:45.616912 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9lkt" event={"ID":"daa6c1cb-f03a-4d56-8953-244f08b4c4c8","Type":"ContainerDied","Data":"dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7"} Oct 01 15:40:46 crc kubenswrapper[4771]: I1001 15:40:46.632217 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9lkt" event={"ID":"daa6c1cb-f03a-4d56-8953-244f08b4c4c8","Type":"ContainerStarted","Data":"00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88"} Oct 01 15:40:46 crc kubenswrapper[4771]: I1001 15:40:46.663334 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w9lkt" podStartSLOduration=2.22543617 podStartE2EDuration="4.663314629s" podCreationTimestamp="2025-10-01 15:40:42 +0000 UTC" firstStartedPulling="2025-10-01 15:40:43.589461835 +0000 UTC m=+2688.208637026" lastFinishedPulling="2025-10-01 15:40:46.027340304 +0000 UTC m=+2690.646515485" observedRunningTime="2025-10-01 15:40:46.661080692 +0000 UTC m=+2691.280255873" watchObservedRunningTime="2025-10-01 15:40:46.663314629 +0000 UTC m=+2691.282489800" Oct 01 15:40:52 crc kubenswrapper[4771]: I1001 15:40:52.600907 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:52 crc kubenswrapper[4771]: I1001 15:40:52.601426 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:52 crc kubenswrapper[4771]: I1001 15:40:52.643544 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:52 crc kubenswrapper[4771]: I1001 15:40:52.738191 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:52 crc kubenswrapper[4771]: I1001 15:40:52.876070 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9lkt"] Oct 01 15:40:54 crc kubenswrapper[4771]: I1001 15:40:54.714345 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w9lkt" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerName="registry-server" containerID="cri-o://00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88" gracePeriod=2 Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.118850 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.218288 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-catalog-content\") pod \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.218457 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-utilities\") pod \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.218515 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nc5h\" (UniqueName: \"kubernetes.io/projected/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-kube-api-access-5nc5h\") pod \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\" (UID: \"daa6c1cb-f03a-4d56-8953-244f08b4c4c8\") " Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.219502 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-utilities" (OuterVolumeSpecName: "utilities") pod "daa6c1cb-f03a-4d56-8953-244f08b4c4c8" (UID: "daa6c1cb-f03a-4d56-8953-244f08b4c4c8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.225974 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-kube-api-access-5nc5h" (OuterVolumeSpecName: "kube-api-access-5nc5h") pod "daa6c1cb-f03a-4d56-8953-244f08b4c4c8" (UID: "daa6c1cb-f03a-4d56-8953-244f08b4c4c8"). InnerVolumeSpecName "kube-api-access-5nc5h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.231935 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "daa6c1cb-f03a-4d56-8953-244f08b4c4c8" (UID: "daa6c1cb-f03a-4d56-8953-244f08b4c4c8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.320445 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nc5h\" (UniqueName: \"kubernetes.io/projected/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-kube-api-access-5nc5h\") on node \"crc\" DevicePath \"\"" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.320488 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.320498 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/daa6c1cb-f03a-4d56-8953-244f08b4c4c8-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.726813 4771 generic.go:334] "Generic (PLEG): container finished" podID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerID="00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88" exitCode=0 Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.726862 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w9lkt" event={"ID":"daa6c1cb-f03a-4d56-8953-244f08b4c4c8","Type":"ContainerDied","Data":"00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88"} Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.726891 4771 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-w9lkt" event={"ID":"daa6c1cb-f03a-4d56-8953-244f08b4c4c8","Type":"ContainerDied","Data":"0dec2b495399d07b3e777f6248e253fa48528e55634956c50a308f8d68c2b77b"} Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.726908 4771 scope.go:117] "RemoveContainer" containerID="00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.726927 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w9lkt" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.749385 4771 scope.go:117] "RemoveContainer" containerID="dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.765274 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9lkt"] Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.772516 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w9lkt"] Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.776244 4771 scope.go:117] "RemoveContainer" containerID="21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.824275 4771 scope.go:117] "RemoveContainer" containerID="00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88" Oct 01 15:40:55 crc kubenswrapper[4771]: E1001 15:40:55.825304 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88\": container with ID starting with 00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88 not found: ID does not exist" containerID="00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.825356 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88"} err="failed to get container status \"00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88\": rpc error: code = NotFound desc = could not find container \"00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88\": container with ID starting with 00211f7c5f7ab2f72987b524b83bf626882822db9f8f70019e131bb7c4a0ef88 not found: ID does not exist" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.825390 4771 scope.go:117] "RemoveContainer" containerID="dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7" Oct 01 15:40:55 crc kubenswrapper[4771]: E1001 15:40:55.825823 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7\": container with ID starting with dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7 not found: ID does not exist" containerID="dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.825866 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7"} err="failed to get container status \"dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7\": rpc error: code = NotFound desc = could not find container \"dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7\": container with ID starting with dfb7d6b50197c78cbf63decd509700238b025b7bfc14080d4e29d089ad10c1b7 not found: ID does not exist" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.825893 4771 scope.go:117] "RemoveContainer" containerID="21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4" Oct 01 15:40:55 crc kubenswrapper[4771]: E1001 
15:40:55.826511 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4\": container with ID starting with 21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4 not found: ID does not exist" containerID="21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4" Oct 01 15:40:55 crc kubenswrapper[4771]: I1001 15:40:55.826551 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4"} err="failed to get container status \"21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4\": rpc error: code = NotFound desc = could not find container \"21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4\": container with ID starting with 21f44f1143f3e0e0526932c5f180a5b13bb0e184154b3c377e3be14f6eab28a4 not found: ID does not exist" Oct 01 15:40:56 crc kubenswrapper[4771]: I1001 15:40:56.009164 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" path="/var/lib/kubelet/pods/daa6c1cb-f03a-4d56-8953-244f08b4c4c8/volumes" Oct 01 15:41:12 crc kubenswrapper[4771]: I1001 15:41:12.177443 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:41:12 crc kubenswrapper[4771]: I1001 15:41:12.178096 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 01 15:41:12 crc kubenswrapper[4771]: I1001 15:41:12.178157 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:41:12 crc kubenswrapper[4771]: I1001 15:41:12.179033 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0d782d6a5f89be0ead3cef9790ea83544120df7e378bb400716acc9b592f88b8"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:41:12 crc kubenswrapper[4771]: I1001 15:41:12.179140 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://0d782d6a5f89be0ead3cef9790ea83544120df7e378bb400716acc9b592f88b8" gracePeriod=600 Oct 01 15:41:12 crc kubenswrapper[4771]: I1001 15:41:12.924420 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="0d782d6a5f89be0ead3cef9790ea83544120df7e378bb400716acc9b592f88b8" exitCode=0 Oct 01 15:41:12 crc kubenswrapper[4771]: I1001 15:41:12.924468 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"0d782d6a5f89be0ead3cef9790ea83544120df7e378bb400716acc9b592f88b8"} Oct 01 15:41:12 crc kubenswrapper[4771]: I1001 15:41:12.924885 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8"} Oct 01 15:41:12 crc 
kubenswrapper[4771]: I1001 15:41:12.924918 4771 scope.go:117] "RemoveContainer" containerID="16ae180b9499312005658862660284e647910a63ac7bfb5649900bd12b06fcf8" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.810103 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8znx"] Oct 01 15:41:33 crc kubenswrapper[4771]: E1001 15:41:33.811251 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerName="extract-utilities" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.811274 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerName="extract-utilities" Oct 01 15:41:33 crc kubenswrapper[4771]: E1001 15:41:33.811331 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerName="extract-content" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.811342 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerName="extract-content" Oct 01 15:41:33 crc kubenswrapper[4771]: E1001 15:41:33.811355 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerName="registry-server" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.811367 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerName="registry-server" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.811610 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa6c1cb-f03a-4d56-8953-244f08b4c4c8" containerName="registry-server" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.813617 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.828271 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8znx"] Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.911988 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-utilities\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.912050 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79rqc\" (UniqueName: \"kubernetes.io/projected/8a264cf5-43d0-496e-9b68-214da2676f4c-kube-api-access-79rqc\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:33 crc kubenswrapper[4771]: I1001 15:41:33.912079 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-catalog-content\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:34 crc kubenswrapper[4771]: I1001 15:41:34.013863 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-utilities\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:34 crc kubenswrapper[4771]: I1001 15:41:34.013938 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-79rqc\" (UniqueName: \"kubernetes.io/projected/8a264cf5-43d0-496e-9b68-214da2676f4c-kube-api-access-79rqc\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:34 crc kubenswrapper[4771]: I1001 15:41:34.013971 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-catalog-content\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:34 crc kubenswrapper[4771]: I1001 15:41:34.014414 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-utilities\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:34 crc kubenswrapper[4771]: I1001 15:41:34.014459 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-catalog-content\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:34 crc kubenswrapper[4771]: I1001 15:41:34.040828 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79rqc\" (UniqueName: \"kubernetes.io/projected/8a264cf5-43d0-496e-9b68-214da2676f4c-kube-api-access-79rqc\") pod \"community-operators-r8znx\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:34 crc kubenswrapper[4771]: I1001 15:41:34.140686 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:34 crc kubenswrapper[4771]: I1001 15:41:34.696110 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8znx"] Oct 01 15:41:35 crc kubenswrapper[4771]: I1001 15:41:35.156971 4771 generic.go:334] "Generic (PLEG): container finished" podID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerID="5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302" exitCode=0 Oct 01 15:41:35 crc kubenswrapper[4771]: I1001 15:41:35.157054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8znx" event={"ID":"8a264cf5-43d0-496e-9b68-214da2676f4c","Type":"ContainerDied","Data":"5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302"} Oct 01 15:41:35 crc kubenswrapper[4771]: I1001 15:41:35.157104 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8znx" event={"ID":"8a264cf5-43d0-496e-9b68-214da2676f4c","Type":"ContainerStarted","Data":"48a587cad01dbaef0c5e4d5bc6ec3731cc958e644662a99075f57bc662b69ee2"} Oct 01 15:41:35 crc kubenswrapper[4771]: I1001 15:41:35.160662 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:41:36 crc kubenswrapper[4771]: I1001 15:41:36.166770 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8znx" event={"ID":"8a264cf5-43d0-496e-9b68-214da2676f4c","Type":"ContainerStarted","Data":"ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99"} Oct 01 15:41:37 crc kubenswrapper[4771]: I1001 15:41:37.181293 4771 generic.go:334] "Generic (PLEG): container finished" podID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerID="ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99" exitCode=0 Oct 01 15:41:37 crc kubenswrapper[4771]: I1001 15:41:37.181450 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-r8znx" event={"ID":"8a264cf5-43d0-496e-9b68-214da2676f4c","Type":"ContainerDied","Data":"ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99"} Oct 01 15:41:38 crc kubenswrapper[4771]: I1001 15:41:38.195784 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8znx" event={"ID":"8a264cf5-43d0-496e-9b68-214da2676f4c","Type":"ContainerStarted","Data":"eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930"} Oct 01 15:41:38 crc kubenswrapper[4771]: I1001 15:41:38.232889 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8znx" podStartSLOduration=2.693960002 podStartE2EDuration="5.232862047s" podCreationTimestamp="2025-10-01 15:41:33 +0000 UTC" firstStartedPulling="2025-10-01 15:41:35.160359798 +0000 UTC m=+2739.779534989" lastFinishedPulling="2025-10-01 15:41:37.699261853 +0000 UTC m=+2742.318437034" observedRunningTime="2025-10-01 15:41:38.219595873 +0000 UTC m=+2742.838771084" watchObservedRunningTime="2025-10-01 15:41:38.232862047 +0000 UTC m=+2742.852037248" Oct 01 15:41:44 crc kubenswrapper[4771]: I1001 15:41:44.141469 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:44 crc kubenswrapper[4771]: I1001 15:41:44.142097 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:44 crc kubenswrapper[4771]: I1001 15:41:44.229627 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:44 crc kubenswrapper[4771]: I1001 15:41:44.345909 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:44 crc kubenswrapper[4771]: I1001 
15:41:44.473944 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8znx"] Oct 01 15:41:46 crc kubenswrapper[4771]: I1001 15:41:46.301560 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r8znx" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerName="registry-server" containerID="cri-o://eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930" gracePeriod=2 Oct 01 15:41:46 crc kubenswrapper[4771]: I1001 15:41:46.794363 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:46 crc kubenswrapper[4771]: I1001 15:41:46.879169 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79rqc\" (UniqueName: \"kubernetes.io/projected/8a264cf5-43d0-496e-9b68-214da2676f4c-kube-api-access-79rqc\") pod \"8a264cf5-43d0-496e-9b68-214da2676f4c\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " Oct 01 15:41:46 crc kubenswrapper[4771]: I1001 15:41:46.879435 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-utilities\") pod \"8a264cf5-43d0-496e-9b68-214da2676f4c\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " Oct 01 15:41:46 crc kubenswrapper[4771]: I1001 15:41:46.879545 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-catalog-content\") pod \"8a264cf5-43d0-496e-9b68-214da2676f4c\" (UID: \"8a264cf5-43d0-496e-9b68-214da2676f4c\") " Oct 01 15:41:46 crc kubenswrapper[4771]: I1001 15:41:46.884073 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-utilities" (OuterVolumeSpecName: 
"utilities") pod "8a264cf5-43d0-496e-9b68-214da2676f4c" (UID: "8a264cf5-43d0-496e-9b68-214da2676f4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:41:46 crc kubenswrapper[4771]: I1001 15:41:46.892992 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a264cf5-43d0-496e-9b68-214da2676f4c-kube-api-access-79rqc" (OuterVolumeSpecName: "kube-api-access-79rqc") pod "8a264cf5-43d0-496e-9b68-214da2676f4c" (UID: "8a264cf5-43d0-496e-9b68-214da2676f4c"). InnerVolumeSpecName "kube-api-access-79rqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:41:46 crc kubenswrapper[4771]: I1001 15:41:46.948418 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a264cf5-43d0-496e-9b68-214da2676f4c" (UID: "8a264cf5-43d0-496e-9b68-214da2676f4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.019539 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.019576 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79rqc\" (UniqueName: \"kubernetes.io/projected/8a264cf5-43d0-496e-9b68-214da2676f4c-kube-api-access-79rqc\") on node \"crc\" DevicePath \"\"" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.019589 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a264cf5-43d0-496e-9b68-214da2676f4c-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.312260 4771 generic.go:334] "Generic (PLEG): container finished" podID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerID="eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930" exitCode=0 Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.312304 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8znx" event={"ID":"8a264cf5-43d0-496e-9b68-214da2676f4c","Type":"ContainerDied","Data":"eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930"} Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.312354 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8znx" event={"ID":"8a264cf5-43d0-496e-9b68-214da2676f4c","Type":"ContainerDied","Data":"48a587cad01dbaef0c5e4d5bc6ec3731cc958e644662a99075f57bc662b69ee2"} Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.312374 4771 scope.go:117] "RemoveContainer" containerID="eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 
15:41:47.313558 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-r8znx" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.337834 4771 scope.go:117] "RemoveContainer" containerID="ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.360134 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8znx"] Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.371849 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r8znx"] Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.373503 4771 scope.go:117] "RemoveContainer" containerID="5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.447028 4771 scope.go:117] "RemoveContainer" containerID="eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930" Oct 01 15:41:47 crc kubenswrapper[4771]: E1001 15:41:47.447575 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930\": container with ID starting with eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930 not found: ID does not exist" containerID="eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.447633 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930"} err="failed to get container status \"eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930\": rpc error: code = NotFound desc = could not find container \"eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930\": container with ID starting with 
eae890180264d913769a5967e37a310629615728aa23a6b707de2c3664062930 not found: ID does not exist" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.447663 4771 scope.go:117] "RemoveContainer" containerID="ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99" Oct 01 15:41:47 crc kubenswrapper[4771]: E1001 15:41:47.448036 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99\": container with ID starting with ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99 not found: ID does not exist" containerID="ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.448067 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99"} err="failed to get container status \"ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99\": rpc error: code = NotFound desc = could not find container \"ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99\": container with ID starting with ef0c8290df6fabf0c6c10f312ab24107df793446b2ad148b3122057753912c99 not found: ID does not exist" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.448089 4771 scope.go:117] "RemoveContainer" containerID="5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302" Oct 01 15:41:47 crc kubenswrapper[4771]: E1001 15:41:47.448389 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302\": container with ID starting with 5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302 not found: ID does not exist" containerID="5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302" Oct 01 15:41:47 crc 
kubenswrapper[4771]: I1001 15:41:47.448422 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302"} err="failed to get container status \"5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302\": rpc error: code = NotFound desc = could not find container \"5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302\": container with ID starting with 5c5d6ecc23cf71b9a0ef685cd9930c7757647cc2f3b952bfe0d6a1d4b7743302 not found: ID does not exist" Oct 01 15:41:47 crc kubenswrapper[4771]: I1001 15:41:47.999162 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" path="/var/lib/kubelet/pods/8a264cf5-43d0-496e-9b68-214da2676f4c/volumes" Oct 01 15:41:59 crc kubenswrapper[4771]: I1001 15:41:59.452303 4771 generic.go:334] "Generic (PLEG): container finished" podID="b938863c-4e4f-414a-9b0b-2d2583d9ae0c" containerID="965ed52cfa015a7187563c13643581155aa21ec6fbedf3c00ee8de7838d8b47a" exitCode=0 Oct 01 15:41:59 crc kubenswrapper[4771]: I1001 15:41:59.452422 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" event={"ID":"b938863c-4e4f-414a-9b0b-2d2583d9ae0c","Type":"ContainerDied","Data":"965ed52cfa015a7187563c13643581155aa21ec6fbedf3c00ee8de7838d8b47a"} Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.935588 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.977053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-telemetry-combined-ca-bundle\") pod \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.977124 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-inventory\") pod \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.977176 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-1\") pod \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.977222 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2qns\" (UniqueName: \"kubernetes.io/projected/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-kube-api-access-d2qns\") pod \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.977303 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-2\") pod \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " Oct 01 15:42:00 crc kubenswrapper[4771]: 
I1001 15:42:00.977352 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ssh-key\") pod \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.977397 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-0\") pod \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\" (UID: \"b938863c-4e4f-414a-9b0b-2d2583d9ae0c\") " Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.984556 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "b938863c-4e4f-414a-9b0b-2d2583d9ae0c" (UID: "b938863c-4e4f-414a-9b0b-2d2583d9ae0c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:42:00 crc kubenswrapper[4771]: I1001 15:42:00.998218 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-kube-api-access-d2qns" (OuterVolumeSpecName: "kube-api-access-d2qns") pod "b938863c-4e4f-414a-9b0b-2d2583d9ae0c" (UID: "b938863c-4e4f-414a-9b0b-2d2583d9ae0c"). InnerVolumeSpecName "kube-api-access-d2qns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.013583 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "b938863c-4e4f-414a-9b0b-2d2583d9ae0c" (UID: "b938863c-4e4f-414a-9b0b-2d2583d9ae0c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.014041 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "b938863c-4e4f-414a-9b0b-2d2583d9ae0c" (UID: "b938863c-4e4f-414a-9b0b-2d2583d9ae0c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.019356 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-inventory" (OuterVolumeSpecName: "inventory") pod "b938863c-4e4f-414a-9b0b-2d2583d9ae0c" (UID: "b938863c-4e4f-414a-9b0b-2d2583d9ae0c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.019901 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "b938863c-4e4f-414a-9b0b-2d2583d9ae0c" (UID: "b938863c-4e4f-414a-9b0b-2d2583d9ae0c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.024903 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b938863c-4e4f-414a-9b0b-2d2583d9ae0c" (UID: "b938863c-4e4f-414a-9b0b-2d2583d9ae0c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.079460 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.079487 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.079498 4771 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.079508 4771 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-inventory\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.079516 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.079525 4771 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-d2qns\" (UniqueName: \"kubernetes.io/projected/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-kube-api-access-d2qns\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.079534 4771 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/b938863c-4e4f-414a-9b0b-2d2583d9ae0c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.475534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" event={"ID":"b938863c-4e4f-414a-9b0b-2d2583d9ae0c","Type":"ContainerDied","Data":"93d90989e85ee000fe11a13316ddc339c70d03da206489282a7d4c1240ffd1ae"} Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.475575 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93d90989e85ee000fe11a13316ddc339c70d03da206489282a7d4c1240ffd1ae" Oct 01 15:42:01 crc kubenswrapper[4771]: I1001 15:42:01.475683 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.424624 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7l66t"] Oct 01 15:42:38 crc kubenswrapper[4771]: E1001 15:42:38.425854 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerName="extract-utilities" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.425877 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerName="extract-utilities" Oct 01 15:42:38 crc kubenswrapper[4771]: E1001 15:42:38.425939 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerName="extract-content" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.425953 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerName="extract-content" Oct 01 15:42:38 crc kubenswrapper[4771]: E1001 15:42:38.425977 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerName="registry-server" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.425993 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerName="registry-server" Oct 01 15:42:38 crc kubenswrapper[4771]: E1001 15:42:38.426028 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b938863c-4e4f-414a-9b0b-2d2583d9ae0c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.426041 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b938863c-4e4f-414a-9b0b-2d2583d9ae0c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.426382 4771 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8a264cf5-43d0-496e-9b68-214da2676f4c" containerName="registry-server" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.426409 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b938863c-4e4f-414a-9b0b-2d2583d9ae0c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.428793 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.453863 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7l66t"] Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.566692 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crq8x\" (UniqueName: \"kubernetes.io/projected/5219ba20-7cd6-45d0-aa66-67e8eff7280d-kube-api-access-crq8x\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.567293 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-catalog-content\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.567467 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-utilities\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: 
I1001 15:42:38.669519 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crq8x\" (UniqueName: \"kubernetes.io/projected/5219ba20-7cd6-45d0-aa66-67e8eff7280d-kube-api-access-crq8x\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.670170 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-catalog-content\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.670236 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-utilities\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.670944 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-utilities\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.672946 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-catalog-content\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.706839 4771 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-crq8x\" (UniqueName: \"kubernetes.io/projected/5219ba20-7cd6-45d0-aa66-67e8eff7280d-kube-api-access-crq8x\") pod \"redhat-operators-7l66t\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:38 crc kubenswrapper[4771]: I1001 15:42:38.762429 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:39 crc kubenswrapper[4771]: I1001 15:42:39.256053 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7l66t"] Oct 01 15:42:39 crc kubenswrapper[4771]: I1001 15:42:39.875630 4771 generic.go:334] "Generic (PLEG): container finished" podID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerID="3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e" exitCode=0 Oct 01 15:42:39 crc kubenswrapper[4771]: I1001 15:42:39.875770 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l66t" event={"ID":"5219ba20-7cd6-45d0-aa66-67e8eff7280d","Type":"ContainerDied","Data":"3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e"} Oct 01 15:42:39 crc kubenswrapper[4771]: I1001 15:42:39.876037 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l66t" event={"ID":"5219ba20-7cd6-45d0-aa66-67e8eff7280d","Type":"ContainerStarted","Data":"a86de9e7d9dee72f3c72900a83afb4dc7a8c1c622aac59f045cc493282be68c7"} Oct 01 15:42:40 crc kubenswrapper[4771]: I1001 15:42:40.886620 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l66t" event={"ID":"5219ba20-7cd6-45d0-aa66-67e8eff7280d","Type":"ContainerStarted","Data":"5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175"} Oct 01 15:42:41 crc kubenswrapper[4771]: I1001 15:42:41.897360 4771 generic.go:334] "Generic (PLEG): container finished" 
podID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerID="5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175" exitCode=0 Oct 01 15:42:41 crc kubenswrapper[4771]: I1001 15:42:41.897488 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l66t" event={"ID":"5219ba20-7cd6-45d0-aa66-67e8eff7280d","Type":"ContainerDied","Data":"5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175"} Oct 01 15:42:42 crc kubenswrapper[4771]: I1001 15:42:42.923871 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l66t" event={"ID":"5219ba20-7cd6-45d0-aa66-67e8eff7280d","Type":"ContainerStarted","Data":"23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b"} Oct 01 15:42:42 crc kubenswrapper[4771]: I1001 15:42:42.955561 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7l66t" podStartSLOduration=2.450699734 podStartE2EDuration="4.95552736s" podCreationTimestamp="2025-10-01 15:42:38 +0000 UTC" firstStartedPulling="2025-10-01 15:42:39.878856889 +0000 UTC m=+2804.498032060" lastFinishedPulling="2025-10-01 15:42:42.383684515 +0000 UTC m=+2807.002859686" observedRunningTime="2025-10-01 15:42:42.951261874 +0000 UTC m=+2807.570437055" watchObservedRunningTime="2025-10-01 15:42:42.95552736 +0000 UTC m=+2807.574702571" Oct 01 15:42:48 crc kubenswrapper[4771]: I1001 15:42:48.763614 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:48 crc kubenswrapper[4771]: I1001 15:42:48.764161 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:48 crc kubenswrapper[4771]: I1001 15:42:48.845245 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:49 crc 
kubenswrapper[4771]: I1001 15:42:49.055041 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:49 crc kubenswrapper[4771]: I1001 15:42:49.115282 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7l66t"] Oct 01 15:42:51 crc kubenswrapper[4771]: I1001 15:42:51.007823 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7l66t" podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerName="registry-server" containerID="cri-o://23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b" gracePeriod=2 Oct 01 15:42:51 crc kubenswrapper[4771]: I1001 15:42:51.452524 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:51 crc kubenswrapper[4771]: I1001 15:42:51.560367 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-catalog-content\") pod \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " Oct 01 15:42:51 crc kubenswrapper[4771]: I1001 15:42:51.560614 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crq8x\" (UniqueName: \"kubernetes.io/projected/5219ba20-7cd6-45d0-aa66-67e8eff7280d-kube-api-access-crq8x\") pod \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " Oct 01 15:42:51 crc kubenswrapper[4771]: I1001 15:42:51.560654 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-utilities\") pod \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\" (UID: \"5219ba20-7cd6-45d0-aa66-67e8eff7280d\") " Oct 01 15:42:51 crc 
kubenswrapper[4771]: I1001 15:42:51.561700 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-utilities" (OuterVolumeSpecName: "utilities") pod "5219ba20-7cd6-45d0-aa66-67e8eff7280d" (UID: "5219ba20-7cd6-45d0-aa66-67e8eff7280d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:42:51 crc kubenswrapper[4771]: I1001 15:42:51.567095 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5219ba20-7cd6-45d0-aa66-67e8eff7280d-kube-api-access-crq8x" (OuterVolumeSpecName: "kube-api-access-crq8x") pod "5219ba20-7cd6-45d0-aa66-67e8eff7280d" (UID: "5219ba20-7cd6-45d0-aa66-67e8eff7280d"). InnerVolumeSpecName "kube-api-access-crq8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:42:51 crc kubenswrapper[4771]: I1001 15:42:51.663293 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crq8x\" (UniqueName: \"kubernetes.io/projected/5219ba20-7cd6-45d0-aa66-67e8eff7280d-kube-api-access-crq8x\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:51 crc kubenswrapper[4771]: I1001 15:42:51.663333 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.022282 4771 generic.go:334] "Generic (PLEG): container finished" podID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerID="23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b" exitCode=0 Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.022355 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l66t" event={"ID":"5219ba20-7cd6-45d0-aa66-67e8eff7280d","Type":"ContainerDied","Data":"23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b"} Oct 01 15:42:52 
crc kubenswrapper[4771]: I1001 15:42:52.022389 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7l66t" event={"ID":"5219ba20-7cd6-45d0-aa66-67e8eff7280d","Type":"ContainerDied","Data":"a86de9e7d9dee72f3c72900a83afb4dc7a8c1c622aac59f045cc493282be68c7"} Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.022396 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7l66t" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.022412 4771 scope.go:117] "RemoveContainer" containerID="23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.055219 4771 scope.go:117] "RemoveContainer" containerID="5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.088390 4771 scope.go:117] "RemoveContainer" containerID="3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.126486 4771 scope.go:117] "RemoveContainer" containerID="23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b" Oct 01 15:42:52 crc kubenswrapper[4771]: E1001 15:42:52.126961 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b\": container with ID starting with 23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b not found: ID does not exist" containerID="23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.127011 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b"} err="failed to get container status 
\"23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b\": rpc error: code = NotFound desc = could not find container \"23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b\": container with ID starting with 23674b45444395fa685cdb91eaee465b8cb95de4775426aac59ef98f41be3b1b not found: ID does not exist" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.127040 4771 scope.go:117] "RemoveContainer" containerID="5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175" Oct 01 15:42:52 crc kubenswrapper[4771]: E1001 15:42:52.127402 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175\": container with ID starting with 5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175 not found: ID does not exist" containerID="5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.127422 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175"} err="failed to get container status \"5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175\": rpc error: code = NotFound desc = could not find container \"5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175\": container with ID starting with 5360faacca0e88cf182822506989855d662202326e629bffa73a53b48b991175 not found: ID does not exist" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.127435 4771 scope.go:117] "RemoveContainer" containerID="3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e" Oct 01 15:42:52 crc kubenswrapper[4771]: E1001 15:42:52.127771 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e\": container with ID starting with 3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e not found: ID does not exist" containerID="3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.127839 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e"} err="failed to get container status \"3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e\": rpc error: code = NotFound desc = could not find container \"3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e\": container with ID starting with 3ac08ffd028ce3f59ebab53c160123ba159423008425397f48e3e6d25e963e9e not found: ID does not exist" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.887875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5219ba20-7cd6-45d0-aa66-67e8eff7280d" (UID: "5219ba20-7cd6-45d0-aa66-67e8eff7280d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.888135 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5219ba20-7cd6-45d0-aa66-67e8eff7280d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.973623 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7l66t"] Oct 01 15:42:52 crc kubenswrapper[4771]: I1001 15:42:52.984005 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7l66t"] Oct 01 15:42:53 crc kubenswrapper[4771]: I1001 15:42:53.996973 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" path="/var/lib/kubelet/pods/5219ba20-7cd6-45d0-aa66-67e8eff7280d/volumes" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.014248 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 15:43:04 crc kubenswrapper[4771]: E1001 15:43:04.015205 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerName="extract-content" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.015221 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerName="extract-content" Oct 01 15:43:04 crc kubenswrapper[4771]: E1001 15:43:04.015239 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerName="registry-server" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.015248 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerName="registry-server" Oct 01 15:43:04 crc kubenswrapper[4771]: E1001 15:43:04.015284 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerName="extract-utilities" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.015293 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerName="extract-utilities" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.015526 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="5219ba20-7cd6-45d0-aa66-67e8eff7280d" containerName="registry-server" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.016260 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.019200 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n7bzx" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.019316 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.019374 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.019562 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.031360 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.125804 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 
15:43:04.125885 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.125947 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.125997 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.126033 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.126066 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.126081 4771 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-config-data\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.126123 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnps4\" (UniqueName: \"kubernetes.io/projected/32741182-3a7c-43a7-b996-1dd78a418dc6-kube-api-access-gnps4\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.126633 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.228948 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229070 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229144 4771 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229180 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-config-data\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229258 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnps4\" (UniqueName: \"kubernetes.io/projected/32741182-3a7c-43a7-b996-1dd78a418dc6-kube-api-access-gnps4\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229295 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229411 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229487 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229547 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.229896 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.230559 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.231590 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.232296 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config\") 
pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.232819 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-config-data\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.236217 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.236277 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.240175 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.251323 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnps4\" (UniqueName: \"kubernetes.io/projected/32741182-3a7c-43a7-b996-1dd78a418dc6-kube-api-access-gnps4\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 
15:43:04.284112 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.365875 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 15:43:04 crc kubenswrapper[4771]: I1001 15:43:04.871863 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 01 15:43:05 crc kubenswrapper[4771]: I1001 15:43:05.176804 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"32741182-3a7c-43a7-b996-1dd78a418dc6","Type":"ContainerStarted","Data":"cce6e860c528cc587042ee77f1257f13f58e6ed28483e2c8082a80c2f57b3068"} Oct 01 15:43:12 crc kubenswrapper[4771]: I1001 15:43:12.177653 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:43:12 crc kubenswrapper[4771]: I1001 15:43:12.178198 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:43:30 crc kubenswrapper[4771]: E1001 15:43:30.963579 4771 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Oct 01 15:43:30 crc kubenswrapper[4771]: 
E1001 15:43:30.964481 4771 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnps4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,Mount
Propagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(32741182-3a7c-43a7-b996-1dd78a418dc6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 01 15:43:30 crc kubenswrapper[4771]: E1001 15:43:30.965708 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="32741182-3a7c-43a7-b996-1dd78a418dc6" Oct 01 15:43:31 crc kubenswrapper[4771]: E1001 15:43:31.444901 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" 
pod="openstack/tempest-tests-tempest" podUID="32741182-3a7c-43a7-b996-1dd78a418dc6" Oct 01 15:43:42 crc kubenswrapper[4771]: I1001 15:43:42.177153 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:43:42 crc kubenswrapper[4771]: I1001 15:43:42.177888 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:43:43 crc kubenswrapper[4771]: I1001 15:43:43.577477 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"32741182-3a7c-43a7-b996-1dd78a418dc6","Type":"ContainerStarted","Data":"cc9eac0ff84ca09e142b9e58f102604864c84ad74f6ac234bbc3f1bbe8986fca"} Oct 01 15:43:43 crc kubenswrapper[4771]: I1001 15:43:43.596395 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.05830311 podStartE2EDuration="41.596376102s" podCreationTimestamp="2025-10-01 15:43:02 +0000 UTC" firstStartedPulling="2025-10-01 15:43:04.879932681 +0000 UTC m=+2829.499107872" lastFinishedPulling="2025-10-01 15:43:42.418005653 +0000 UTC m=+2867.037180864" observedRunningTime="2025-10-01 15:43:43.594027384 +0000 UTC m=+2868.213202565" watchObservedRunningTime="2025-10-01 15:43:43.596376102 +0000 UTC m=+2868.215551273" Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.177243 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.177880 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.177947 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.178995 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.179122 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" gracePeriod=600 Oct 01 15:44:12 crc kubenswrapper[4771]: E1001 15:44:12.326752 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" 
podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.912380 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" exitCode=0 Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.912431 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8"} Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.912467 4771 scope.go:117] "RemoveContainer" containerID="0d782d6a5f89be0ead3cef9790ea83544120df7e378bb400716acc9b592f88b8" Oct 01 15:44:12 crc kubenswrapper[4771]: I1001 15:44:12.913502 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:44:12 crc kubenswrapper[4771]: E1001 15:44:12.914082 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:44:24 crc kubenswrapper[4771]: I1001 15:44:24.985137 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:44:24 crc kubenswrapper[4771]: E1001 15:44:24.986254 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:44:36 crc kubenswrapper[4771]: I1001 15:44:36.986798 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:44:36 crc kubenswrapper[4771]: E1001 15:44:36.987864 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:44:47 crc kubenswrapper[4771]: I1001 15:44:47.985780 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:44:47 crc kubenswrapper[4771]: E1001 15:44:47.986475 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.207810 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh"] Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.211662 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.214628 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.215396 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.218440 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh"] Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.337405 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/183116aa-5483-4bc8-ac62-a15897d6aeb3-config-volume\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.337553 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/183116aa-5483-4bc8-ac62-a15897d6aeb3-secret-volume\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.337654 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdxt\" (UniqueName: \"kubernetes.io/projected/183116aa-5483-4bc8-ac62-a15897d6aeb3-kube-api-access-thdxt\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.439593 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdxt\" (UniqueName: \"kubernetes.io/projected/183116aa-5483-4bc8-ac62-a15897d6aeb3-kube-api-access-thdxt\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.439888 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/183116aa-5483-4bc8-ac62-a15897d6aeb3-config-volume\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.440001 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/183116aa-5483-4bc8-ac62-a15897d6aeb3-secret-volume\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.441664 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/183116aa-5483-4bc8-ac62-a15897d6aeb3-config-volume\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.450312 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/183116aa-5483-4bc8-ac62-a15897d6aeb3-secret-volume\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.460460 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdxt\" (UniqueName: \"kubernetes.io/projected/183116aa-5483-4bc8-ac62-a15897d6aeb3-kube-api-access-thdxt\") pod \"collect-profiles-29322225-nkdfh\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:00 crc kubenswrapper[4771]: I1001 15:45:00.531420 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:01 crc kubenswrapper[4771]: I1001 15:45:01.026110 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh"] Oct 01 15:45:01 crc kubenswrapper[4771]: W1001 15:45:01.032218 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183116aa_5483_4bc8_ac62_a15897d6aeb3.slice/crio-7f15732d5ea4460f9fdbca680eb9311ad47193b79380664af5b4ef0141758c63 WatchSource:0}: Error finding container 7f15732d5ea4460f9fdbca680eb9311ad47193b79380664af5b4ef0141758c63: Status 404 returned error can't find the container with id 7f15732d5ea4460f9fdbca680eb9311ad47193b79380664af5b4ef0141758c63 Oct 01 15:45:01 crc kubenswrapper[4771]: I1001 15:45:01.389065 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" event={"ID":"183116aa-5483-4bc8-ac62-a15897d6aeb3","Type":"ContainerStarted","Data":"7c4e21485776fca0f6ce5efa9db212e7b4aeab2190a100dc1daf1822e6e5235d"} Oct 01 15:45:01 crc 
kubenswrapper[4771]: I1001 15:45:01.389329 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" event={"ID":"183116aa-5483-4bc8-ac62-a15897d6aeb3","Type":"ContainerStarted","Data":"7f15732d5ea4460f9fdbca680eb9311ad47193b79380664af5b4ef0141758c63"} Oct 01 15:45:01 crc kubenswrapper[4771]: I1001 15:45:01.412288 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" podStartSLOduration=1.412267929 podStartE2EDuration="1.412267929s" podCreationTimestamp="2025-10-01 15:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 15:45:01.406195479 +0000 UTC m=+2946.025370700" watchObservedRunningTime="2025-10-01 15:45:01.412267929 +0000 UTC m=+2946.031443100" Oct 01 15:45:02 crc kubenswrapper[4771]: I1001 15:45:02.405200 4771 generic.go:334] "Generic (PLEG): container finished" podID="183116aa-5483-4bc8-ac62-a15897d6aeb3" containerID="7c4e21485776fca0f6ce5efa9db212e7b4aeab2190a100dc1daf1822e6e5235d" exitCode=0 Oct 01 15:45:02 crc kubenswrapper[4771]: I1001 15:45:02.405336 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" event={"ID":"183116aa-5483-4bc8-ac62-a15897d6aeb3","Type":"ContainerDied","Data":"7c4e21485776fca0f6ce5efa9db212e7b4aeab2190a100dc1daf1822e6e5235d"} Oct 01 15:45:02 crc kubenswrapper[4771]: I1001 15:45:02.985795 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:45:02 crc kubenswrapper[4771]: E1001 15:45:02.986068 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:45:03 crc kubenswrapper[4771]: I1001 15:45:03.934427 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.010669 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thdxt\" (UniqueName: \"kubernetes.io/projected/183116aa-5483-4bc8-ac62-a15897d6aeb3-kube-api-access-thdxt\") pod \"183116aa-5483-4bc8-ac62-a15897d6aeb3\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.011638 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/183116aa-5483-4bc8-ac62-a15897d6aeb3-secret-volume\") pod \"183116aa-5483-4bc8-ac62-a15897d6aeb3\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.011828 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/183116aa-5483-4bc8-ac62-a15897d6aeb3-config-volume\") pod \"183116aa-5483-4bc8-ac62-a15897d6aeb3\" (UID: \"183116aa-5483-4bc8-ac62-a15897d6aeb3\") " Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.013518 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/183116aa-5483-4bc8-ac62-a15897d6aeb3-config-volume" (OuterVolumeSpecName: "config-volume") pod "183116aa-5483-4bc8-ac62-a15897d6aeb3" (UID: "183116aa-5483-4bc8-ac62-a15897d6aeb3"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.015540 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/183116aa-5483-4bc8-ac62-a15897d6aeb3-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.017792 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/183116aa-5483-4bc8-ac62-a15897d6aeb3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "183116aa-5483-4bc8-ac62-a15897d6aeb3" (UID: "183116aa-5483-4bc8-ac62-a15897d6aeb3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.025187 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183116aa-5483-4bc8-ac62-a15897d6aeb3-kube-api-access-thdxt" (OuterVolumeSpecName: "kube-api-access-thdxt") pod "183116aa-5483-4bc8-ac62-a15897d6aeb3" (UID: "183116aa-5483-4bc8-ac62-a15897d6aeb3"). InnerVolumeSpecName "kube-api-access-thdxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.117460 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thdxt\" (UniqueName: \"kubernetes.io/projected/183116aa-5483-4bc8-ac62-a15897d6aeb3-kube-api-access-thdxt\") on node \"crc\" DevicePath \"\"" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.117493 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/183116aa-5483-4bc8-ac62-a15897d6aeb3-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.429369 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" event={"ID":"183116aa-5483-4bc8-ac62-a15897d6aeb3","Type":"ContainerDied","Data":"7f15732d5ea4460f9fdbca680eb9311ad47193b79380664af5b4ef0141758c63"} Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.429425 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f15732d5ea4460f9fdbca680eb9311ad47193b79380664af5b4ef0141758c63" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.429422 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322225-nkdfh" Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.494509 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch"] Oct 01 15:45:04 crc kubenswrapper[4771]: I1001 15:45:04.504861 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322180-7bqch"] Oct 01 15:45:06 crc kubenswrapper[4771]: I1001 15:45:06.002228 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f335a8c9-a8a0-4060-a1e0-690673e260de" path="/var/lib/kubelet/pods/f335a8c9-a8a0-4060-a1e0-690673e260de/volumes" Oct 01 15:45:13 crc kubenswrapper[4771]: I1001 15:45:13.986149 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:45:13 crc kubenswrapper[4771]: E1001 15:45:13.988666 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:45:25 crc kubenswrapper[4771]: I1001 15:45:25.991854 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:45:25 crc kubenswrapper[4771]: E1001 15:45:25.992452 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:45:36 crc kubenswrapper[4771]: I1001 15:45:36.986088 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:45:36 crc kubenswrapper[4771]: E1001 15:45:36.987560 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:45:50 crc kubenswrapper[4771]: I1001 15:45:50.986210 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:45:50 crc kubenswrapper[4771]: E1001 15:45:50.986890 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:46:03 crc kubenswrapper[4771]: I1001 15:46:03.857230 4771 scope.go:117] "RemoveContainer" containerID="896c92d7cf3f1058b27170ad2e06ed3c8a3e8d2e4a1893fd5d60760e7d6128c6" Oct 01 15:46:04 crc kubenswrapper[4771]: I1001 15:46:04.985381 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:46:04 crc kubenswrapper[4771]: E1001 15:46:04.985922 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:46:18 crc kubenswrapper[4771]: I1001 15:46:18.985554 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:46:18 crc kubenswrapper[4771]: E1001 15:46:18.986776 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:46:32 crc kubenswrapper[4771]: I1001 15:46:32.985213 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:46:32 crc kubenswrapper[4771]: E1001 15:46:32.985839 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:46:43 crc kubenswrapper[4771]: I1001 15:46:43.985113 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:46:43 crc kubenswrapper[4771]: E1001 15:46:43.985830 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:46:58 crc kubenswrapper[4771]: I1001 15:46:58.985545 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:46:58 crc kubenswrapper[4771]: E1001 15:46:58.988249 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:47:10 crc kubenswrapper[4771]: I1001 15:47:10.984979 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:47:10 crc kubenswrapper[4771]: E1001 15:47:10.986040 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:47:21 crc kubenswrapper[4771]: I1001 15:47:21.985539 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:47:21 crc kubenswrapper[4771]: E1001 15:47:21.987383 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:47:32 crc kubenswrapper[4771]: I1001 15:47:32.493077 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="3a36d28b-706e-4639-9d68-158427aaa655" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.174:8080/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 01 15:47:32 crc kubenswrapper[4771]: I1001 15:47:32.986686 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:47:32 crc kubenswrapper[4771]: E1001 15:47:32.987356 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:47:46 crc kubenswrapper[4771]: I1001 15:47:46.985485 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:47:46 crc kubenswrapper[4771]: E1001 15:47:46.986153 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:47:59 crc kubenswrapper[4771]: I1001 15:47:59.984968 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:47:59 crc kubenswrapper[4771]: E1001 15:47:59.985818 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:48:13 crc kubenswrapper[4771]: I1001 15:48:13.985461 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:48:13 crc kubenswrapper[4771]: E1001 15:48:13.986809 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:48:26 crc kubenswrapper[4771]: I1001 15:48:26.986787 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:48:26 crc kubenswrapper[4771]: E1001 15:48:26.988503 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:48:38 crc kubenswrapper[4771]: I1001 15:48:38.985667 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:48:38 crc kubenswrapper[4771]: E1001 15:48:38.986303 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:48:53 crc kubenswrapper[4771]: I1001 15:48:53.985532 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:48:53 crc kubenswrapper[4771]: E1001 15:48:53.986408 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:49:07 crc kubenswrapper[4771]: I1001 15:49:07.985218 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:49:07 crc kubenswrapper[4771]: E1001 15:49:07.985850 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:49:22 crc kubenswrapper[4771]: I1001 15:49:22.985575 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:49:24 crc kubenswrapper[4771]: I1001 15:49:24.125659 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"1abb350ca0e7c414aeed7a01011f0b66cc36f50524fa0b00e58e82c7df6d5615"} Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.089567 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bzdqx"] Oct 01 15:50:16 crc kubenswrapper[4771]: E1001 15:50:16.090663 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183116aa-5483-4bc8-ac62-a15897d6aeb3" containerName="collect-profiles" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.090679 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="183116aa-5483-4bc8-ac62-a15897d6aeb3" containerName="collect-profiles" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.090984 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="183116aa-5483-4bc8-ac62-a15897d6aeb3" containerName="collect-profiles" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.092797 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.098557 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bzdqx"] Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.125609 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-666x4\" (UniqueName: \"kubernetes.io/projected/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-kube-api-access-666x4\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.125663 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-catalog-content\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.125944 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-utilities\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.227449 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-utilities\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.227571 4771 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-666x4\" (UniqueName: \"kubernetes.io/projected/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-kube-api-access-666x4\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.227594 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-catalog-content\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.228256 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-catalog-content\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.228304 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-utilities\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.251724 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-666x4\" (UniqueName: \"kubernetes.io/projected/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-kube-api-access-666x4\") pod \"certified-operators-bzdqx\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.421406 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:16 crc kubenswrapper[4771]: I1001 15:50:16.932986 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bzdqx"] Oct 01 15:50:17 crc kubenswrapper[4771]: I1001 15:50:17.645867 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerID="40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387" exitCode=0 Oct 01 15:50:17 crc kubenswrapper[4771]: I1001 15:50:17.646128 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzdqx" event={"ID":"f9ee8175-b718-4d22-aee2-73f7f9f2ff75","Type":"ContainerDied","Data":"40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387"} Oct 01 15:50:17 crc kubenswrapper[4771]: I1001 15:50:17.646916 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzdqx" event={"ID":"f9ee8175-b718-4d22-aee2-73f7f9f2ff75","Type":"ContainerStarted","Data":"66351f4ff23278a7a6755be7b2c22aa08f8a8a2b4a090cc68d34be6a03b191e6"} Oct 01 15:50:17 crc kubenswrapper[4771]: I1001 15:50:17.649538 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:50:19 crc kubenswrapper[4771]: I1001 15:50:19.664506 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzdqx" event={"ID":"f9ee8175-b718-4d22-aee2-73f7f9f2ff75","Type":"ContainerStarted","Data":"6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe"} Oct 01 15:50:20 crc kubenswrapper[4771]: I1001 15:50:20.679010 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerID="6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe" exitCode=0 Oct 01 15:50:20 crc kubenswrapper[4771]: I1001 15:50:20.679053 4771 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-bzdqx" event={"ID":"f9ee8175-b718-4d22-aee2-73f7f9f2ff75","Type":"ContainerDied","Data":"6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe"} Oct 01 15:50:24 crc kubenswrapper[4771]: I1001 15:50:24.755200 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzdqx" event={"ID":"f9ee8175-b718-4d22-aee2-73f7f9f2ff75","Type":"ContainerStarted","Data":"5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f"} Oct 01 15:50:26 crc kubenswrapper[4771]: I1001 15:50:26.421724 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:26 crc kubenswrapper[4771]: I1001 15:50:26.422997 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:26 crc kubenswrapper[4771]: I1001 15:50:26.489613 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:26 crc kubenswrapper[4771]: I1001 15:50:26.513106 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bzdqx" podStartSLOduration=4.471387863 podStartE2EDuration="10.51308707s" podCreationTimestamp="2025-10-01 15:50:16 +0000 UTC" firstStartedPulling="2025-10-01 15:50:17.649250427 +0000 UTC m=+3262.268425608" lastFinishedPulling="2025-10-01 15:50:23.690949644 +0000 UTC m=+3268.310124815" observedRunningTime="2025-10-01 15:50:24.790269922 +0000 UTC m=+3269.409445103" watchObservedRunningTime="2025-10-01 15:50:26.51308707 +0000 UTC m=+3271.132262251" Oct 01 15:50:36 crc kubenswrapper[4771]: I1001 15:50:36.486374 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:36 crc kubenswrapper[4771]: I1001 15:50:36.553412 
4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bzdqx"] Oct 01 15:50:36 crc kubenswrapper[4771]: I1001 15:50:36.876495 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bzdqx" podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerName="registry-server" containerID="cri-o://5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f" gracePeriod=2 Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.370950 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.457562 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-utilities\") pod \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.458923 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-utilities" (OuterVolumeSpecName: "utilities") pod "f9ee8175-b718-4d22-aee2-73f7f9f2ff75" (UID: "f9ee8175-b718-4d22-aee2-73f7f9f2ff75"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.459140 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-catalog-content\") pod \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.459378 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-666x4\" (UniqueName: \"kubernetes.io/projected/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-kube-api-access-666x4\") pod \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\" (UID: \"f9ee8175-b718-4d22-aee2-73f7f9f2ff75\") " Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.460349 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.466379 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-kube-api-access-666x4" (OuterVolumeSpecName: "kube-api-access-666x4") pod "f9ee8175-b718-4d22-aee2-73f7f9f2ff75" (UID: "f9ee8175-b718-4d22-aee2-73f7f9f2ff75"). InnerVolumeSpecName "kube-api-access-666x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.511618 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9ee8175-b718-4d22-aee2-73f7f9f2ff75" (UID: "f9ee8175-b718-4d22-aee2-73f7f9f2ff75"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.562245 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-666x4\" (UniqueName: \"kubernetes.io/projected/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-kube-api-access-666x4\") on node \"crc\" DevicePath \"\"" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.562320 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9ee8175-b718-4d22-aee2-73f7f9f2ff75-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.884661 4771 generic.go:334] "Generic (PLEG): container finished" podID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerID="5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f" exitCode=0 Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.884712 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzdqx" event={"ID":"f9ee8175-b718-4d22-aee2-73f7f9f2ff75","Type":"ContainerDied","Data":"5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f"} Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.884761 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bzdqx" event={"ID":"f9ee8175-b718-4d22-aee2-73f7f9f2ff75","Type":"ContainerDied","Data":"66351f4ff23278a7a6755be7b2c22aa08f8a8a2b4a090cc68d34be6a03b191e6"} Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.884783 4771 scope.go:117] "RemoveContainer" containerID="5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.884792 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bzdqx" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.906880 4771 scope.go:117] "RemoveContainer" containerID="6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.921149 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bzdqx"] Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.929181 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bzdqx"] Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.949364 4771 scope.go:117] "RemoveContainer" containerID="40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387" Oct 01 15:50:37 crc kubenswrapper[4771]: I1001 15:50:37.995995 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" path="/var/lib/kubelet/pods/f9ee8175-b718-4d22-aee2-73f7f9f2ff75/volumes" Oct 01 15:50:38 crc kubenswrapper[4771]: I1001 15:50:38.031808 4771 scope.go:117] "RemoveContainer" containerID="5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f" Oct 01 15:50:38 crc kubenswrapper[4771]: E1001 15:50:38.032300 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f\": container with ID starting with 5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f not found: ID does not exist" containerID="5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f" Oct 01 15:50:38 crc kubenswrapper[4771]: I1001 15:50:38.032357 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f"} err="failed to get container status 
\"5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f\": rpc error: code = NotFound desc = could not find container \"5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f\": container with ID starting with 5ef2d2a36dc7fa3b98090f08d209efff26de1905a11e2be1107454cc1752022f not found: ID does not exist" Oct 01 15:50:38 crc kubenswrapper[4771]: I1001 15:50:38.032391 4771 scope.go:117] "RemoveContainer" containerID="6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe" Oct 01 15:50:38 crc kubenswrapper[4771]: E1001 15:50:38.033561 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe\": container with ID starting with 6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe not found: ID does not exist" containerID="6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe" Oct 01 15:50:38 crc kubenswrapper[4771]: I1001 15:50:38.033598 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe"} err="failed to get container status \"6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe\": rpc error: code = NotFound desc = could not find container \"6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe\": container with ID starting with 6b5a1933bca26e257dc0bf57966c355bb4e9bf87ebed73626b1a64c5e0396efe not found: ID does not exist" Oct 01 15:50:38 crc kubenswrapper[4771]: I1001 15:50:38.033619 4771 scope.go:117] "RemoveContainer" containerID="40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387" Oct 01 15:50:38 crc kubenswrapper[4771]: E1001 15:50:38.034071 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387\": container with ID starting with 40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387 not found: ID does not exist" containerID="40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387" Oct 01 15:50:38 crc kubenswrapper[4771]: I1001 15:50:38.034115 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387"} err="failed to get container status \"40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387\": rpc error: code = NotFound desc = could not find container \"40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387\": container with ID starting with 40c4b55496a0fbeb76a8950ccb7b39c60289c94e0890ccf3b1d53390540b5387 not found: ID does not exist" Oct 01 15:50:55 crc kubenswrapper[4771]: I1001 15:50:55.910823 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gsf5j"] Oct 01 15:50:55 crc kubenswrapper[4771]: E1001 15:50:55.911961 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerName="extract-utilities" Oct 01 15:50:55 crc kubenswrapper[4771]: I1001 15:50:55.912092 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerName="extract-utilities" Oct 01 15:50:55 crc kubenswrapper[4771]: E1001 15:50:55.912138 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerName="extract-content" Oct 01 15:50:55 crc kubenswrapper[4771]: I1001 15:50:55.912147 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerName="extract-content" Oct 01 15:50:55 crc kubenswrapper[4771]: E1001 15:50:55.912163 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerName="registry-server" Oct 01 15:50:55 crc kubenswrapper[4771]: I1001 15:50:55.912172 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerName="registry-server" Oct 01 15:50:55 crc kubenswrapper[4771]: I1001 15:50:55.912421 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9ee8175-b718-4d22-aee2-73f7f9f2ff75" containerName="registry-server" Oct 01 15:50:55 crc kubenswrapper[4771]: I1001 15:50:55.914284 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:55 crc kubenswrapper[4771]: I1001 15:50:55.925890 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsf5j"] Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.004620 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-utilities\") pod \"redhat-marketplace-gsf5j\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.004721 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvjz9\" (UniqueName: \"kubernetes.io/projected/7c5ffd2e-7290-4ba5-ac2b-708af722de19-kube-api-access-nvjz9\") pod \"redhat-marketplace-gsf5j\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.004830 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-catalog-content\") pod \"redhat-marketplace-gsf5j\" (UID: 
\"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.106201 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvjz9\" (UniqueName: \"kubernetes.io/projected/7c5ffd2e-7290-4ba5-ac2b-708af722de19-kube-api-access-nvjz9\") pod \"redhat-marketplace-gsf5j\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.106357 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-catalog-content\") pod \"redhat-marketplace-gsf5j\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.106513 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-utilities\") pod \"redhat-marketplace-gsf5j\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.107090 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-utilities\") pod \"redhat-marketplace-gsf5j\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.107237 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-catalog-content\") pod \"redhat-marketplace-gsf5j\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " 
pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.142807 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvjz9\" (UniqueName: \"kubernetes.io/projected/7c5ffd2e-7290-4ba5-ac2b-708af722de19-kube-api-access-nvjz9\") pod \"redhat-marketplace-gsf5j\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.235634 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:50:56 crc kubenswrapper[4771]: I1001 15:50:56.713665 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsf5j"] Oct 01 15:50:57 crc kubenswrapper[4771]: I1001 15:50:57.066718 4771 generic.go:334] "Generic (PLEG): container finished" podID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerID="936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8" exitCode=0 Oct 01 15:50:57 crc kubenswrapper[4771]: I1001 15:50:57.066835 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsf5j" event={"ID":"7c5ffd2e-7290-4ba5-ac2b-708af722de19","Type":"ContainerDied","Data":"936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8"} Oct 01 15:50:57 crc kubenswrapper[4771]: I1001 15:50:57.067016 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsf5j" event={"ID":"7c5ffd2e-7290-4ba5-ac2b-708af722de19","Type":"ContainerStarted","Data":"97e46f8c988f9607faffca1b048eee31c1d205d83039427cd6839ff3cedffb22"} Oct 01 15:50:59 crc kubenswrapper[4771]: I1001 15:50:59.086968 4771 generic.go:334] "Generic (PLEG): container finished" podID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerID="b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67" exitCode=0 Oct 01 15:50:59 crc 
kubenswrapper[4771]: I1001 15:50:59.087041 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsf5j" event={"ID":"7c5ffd2e-7290-4ba5-ac2b-708af722de19","Type":"ContainerDied","Data":"b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67"} Oct 01 15:51:00 crc kubenswrapper[4771]: I1001 15:51:00.098221 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsf5j" event={"ID":"7c5ffd2e-7290-4ba5-ac2b-708af722de19","Type":"ContainerStarted","Data":"bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8"} Oct 01 15:51:00 crc kubenswrapper[4771]: I1001 15:51:00.123309 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gsf5j" podStartSLOduration=2.563425331 podStartE2EDuration="5.123292997s" podCreationTimestamp="2025-10-01 15:50:55 +0000 UTC" firstStartedPulling="2025-10-01 15:50:57.068323031 +0000 UTC m=+3301.687498202" lastFinishedPulling="2025-10-01 15:50:59.628190697 +0000 UTC m=+3304.247365868" observedRunningTime="2025-10-01 15:51:00.120179881 +0000 UTC m=+3304.739355052" watchObservedRunningTime="2025-10-01 15:51:00.123292997 +0000 UTC m=+3304.742468168" Oct 01 15:51:06 crc kubenswrapper[4771]: I1001 15:51:06.236022 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:51:06 crc kubenswrapper[4771]: I1001 15:51:06.236538 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:51:06 crc kubenswrapper[4771]: I1001 15:51:06.287346 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:51:07 crc kubenswrapper[4771]: I1001 15:51:07.219315 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:51:07 crc kubenswrapper[4771]: I1001 15:51:07.271314 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsf5j"] Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.188155 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gsf5j" podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerName="registry-server" containerID="cri-o://bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8" gracePeriod=2 Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.651347 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.772290 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-utilities\") pod \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.772480 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvjz9\" (UniqueName: \"kubernetes.io/projected/7c5ffd2e-7290-4ba5-ac2b-708af722de19-kube-api-access-nvjz9\") pod \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.772600 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-catalog-content\") pod \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\" (UID: \"7c5ffd2e-7290-4ba5-ac2b-708af722de19\") " Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.773474 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-utilities" (OuterVolumeSpecName: "utilities") pod "7c5ffd2e-7290-4ba5-ac2b-708af722de19" (UID: "7c5ffd2e-7290-4ba5-ac2b-708af722de19"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.777814 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c5ffd2e-7290-4ba5-ac2b-708af722de19-kube-api-access-nvjz9" (OuterVolumeSpecName: "kube-api-access-nvjz9") pod "7c5ffd2e-7290-4ba5-ac2b-708af722de19" (UID: "7c5ffd2e-7290-4ba5-ac2b-708af722de19"). InnerVolumeSpecName "kube-api-access-nvjz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.795129 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c5ffd2e-7290-4ba5-ac2b-708af722de19" (UID: "7c5ffd2e-7290-4ba5-ac2b-708af722de19"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.875405 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.875710 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvjz9\" (UniqueName: \"kubernetes.io/projected/7c5ffd2e-7290-4ba5-ac2b-708af722de19-kube-api-access-nvjz9\") on node \"crc\" DevicePath \"\"" Oct 01 15:51:09 crc kubenswrapper[4771]: I1001 15:51:09.875805 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c5ffd2e-7290-4ba5-ac2b-708af722de19-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.198115 4771 generic.go:334] "Generic (PLEG): container finished" podID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerID="bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8" exitCode=0 Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.198168 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsf5j" event={"ID":"7c5ffd2e-7290-4ba5-ac2b-708af722de19","Type":"ContainerDied","Data":"bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8"} Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.198201 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gsf5j" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.198220 4771 scope.go:117] "RemoveContainer" containerID="bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.198204 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gsf5j" event={"ID":"7c5ffd2e-7290-4ba5-ac2b-708af722de19","Type":"ContainerDied","Data":"97e46f8c988f9607faffca1b048eee31c1d205d83039427cd6839ff3cedffb22"} Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.218809 4771 scope.go:117] "RemoveContainer" containerID="b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.232773 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsf5j"] Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.237854 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gsf5j"] Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.252962 4771 scope.go:117] "RemoveContainer" containerID="936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.294012 4771 scope.go:117] "RemoveContainer" containerID="bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8" Oct 01 15:51:10 crc kubenswrapper[4771]: E1001 15:51:10.294548 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8\": container with ID starting with bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8 not found: ID does not exist" containerID="bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.294582 4771 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8"} err="failed to get container status \"bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8\": rpc error: code = NotFound desc = could not find container \"bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8\": container with ID starting with bfa61e3b2a8801adde6468cae392f06e39613d81d7b9565b8f449318bd7f79b8 not found: ID does not exist" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.294601 4771 scope.go:117] "RemoveContainer" containerID="b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67" Oct 01 15:51:10 crc kubenswrapper[4771]: E1001 15:51:10.295105 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67\": container with ID starting with b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67 not found: ID does not exist" containerID="b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.295167 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67"} err="failed to get container status \"b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67\": rpc error: code = NotFound desc = could not find container \"b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67\": container with ID starting with b19e5c69fc41868126d5d4f16d17af21f0cdaa590eca8f4b4bd460abb0b7da67 not found: ID does not exist" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.295197 4771 scope.go:117] "RemoveContainer" containerID="936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8" Oct 01 15:51:10 crc kubenswrapper[4771]: E1001 
15:51:10.295672 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8\": container with ID starting with 936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8 not found: ID does not exist" containerID="936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8" Oct 01 15:51:10 crc kubenswrapper[4771]: I1001 15:51:10.295768 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8"} err="failed to get container status \"936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8\": rpc error: code = NotFound desc = could not find container \"936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8\": container with ID starting with 936b2c1655230641d400a61df40112aad752ce84f171dcda2444f43ac63a79c8 not found: ID does not exist" Oct 01 15:51:12 crc kubenswrapper[4771]: I1001 15:51:12.014359 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" path="/var/lib/kubelet/pods/7c5ffd2e-7290-4ba5-ac2b-708af722de19/volumes" Oct 01 15:51:42 crc kubenswrapper[4771]: I1001 15:51:42.177215 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:51:42 crc kubenswrapper[4771]: I1001 15:51:42.177911 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 01 15:52:12 crc kubenswrapper[4771]: I1001 15:52:12.177076 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:52:12 crc kubenswrapper[4771]: I1001 15:52:12.177672 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:52:42 crc kubenswrapper[4771]: I1001 15:52:42.177589 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:52:42 crc kubenswrapper[4771]: I1001 15:52:42.178419 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:52:42 crc kubenswrapper[4771]: I1001 15:52:42.178500 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:52:42 crc kubenswrapper[4771]: I1001 15:52:42.180018 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1abb350ca0e7c414aeed7a01011f0b66cc36f50524fa0b00e58e82c7df6d5615"} 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:52:42 crc kubenswrapper[4771]: I1001 15:52:42.180151 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://1abb350ca0e7c414aeed7a01011f0b66cc36f50524fa0b00e58e82c7df6d5615" gracePeriod=600 Oct 01 15:52:43 crc kubenswrapper[4771]: I1001 15:52:43.169125 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="1abb350ca0e7c414aeed7a01011f0b66cc36f50524fa0b00e58e82c7df6d5615" exitCode=0 Oct 01 15:52:43 crc kubenswrapper[4771]: I1001 15:52:43.169384 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"1abb350ca0e7c414aeed7a01011f0b66cc36f50524fa0b00e58e82c7df6d5615"} Oct 01 15:52:43 crc kubenswrapper[4771]: I1001 15:52:43.169720 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269"} Oct 01 15:52:43 crc kubenswrapper[4771]: I1001 15:52:43.169835 4771 scope.go:117] "RemoveContainer" containerID="f0d1081783e8ca281c7f5f2c017e855d55e05891a57714ef76e9593a71d942f8" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.018763 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pjtl6"] Oct 01 15:52:47 crc kubenswrapper[4771]: E1001 15:52:47.019809 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerName="extract-utilities" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.019826 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerName="extract-utilities" Oct 01 15:52:47 crc kubenswrapper[4771]: E1001 15:52:47.019856 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerName="registry-server" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.019864 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerName="registry-server" Oct 01 15:52:47 crc kubenswrapper[4771]: E1001 15:52:47.019876 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerName="extract-content" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.019884 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerName="extract-content" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.020120 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c5ffd2e-7290-4ba5-ac2b-708af722de19" containerName="registry-server" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.021513 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.029838 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjtl6"] Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.061752 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-utilities\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.061823 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5zf5\" (UniqueName: \"kubernetes.io/projected/b99c7a81-3d50-4289-b666-d8078af0865b-kube-api-access-f5zf5\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.061891 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-catalog-content\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.164285 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-utilities\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.164373 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f5zf5\" (UniqueName: \"kubernetes.io/projected/b99c7a81-3d50-4289-b666-d8078af0865b-kube-api-access-f5zf5\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.164449 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-catalog-content\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.164885 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-catalog-content\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.165096 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-utilities\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.192607 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5zf5\" (UniqueName: \"kubernetes.io/projected/b99c7a81-3d50-4289-b666-d8078af0865b-kube-api-access-f5zf5\") pod \"redhat-operators-pjtl6\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.354003 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:47 crc kubenswrapper[4771]: I1001 15:52:47.836757 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pjtl6"] Oct 01 15:52:48 crc kubenswrapper[4771]: I1001 15:52:48.216417 4771 generic.go:334] "Generic (PLEG): container finished" podID="b99c7a81-3d50-4289-b666-d8078af0865b" containerID="e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7" exitCode=0 Oct 01 15:52:48 crc kubenswrapper[4771]: I1001 15:52:48.216592 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjtl6" event={"ID":"b99c7a81-3d50-4289-b666-d8078af0865b","Type":"ContainerDied","Data":"e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7"} Oct 01 15:52:48 crc kubenswrapper[4771]: I1001 15:52:48.216839 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjtl6" event={"ID":"b99c7a81-3d50-4289-b666-d8078af0865b","Type":"ContainerStarted","Data":"40f384f21eb0be696e2626844ea0f9ad757f70bbb0c9f1d6a5cd2647e6cef165"} Oct 01 15:52:50 crc kubenswrapper[4771]: I1001 15:52:50.244481 4771 generic.go:334] "Generic (PLEG): container finished" podID="b99c7a81-3d50-4289-b666-d8078af0865b" containerID="00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891" exitCode=0 Oct 01 15:52:50 crc kubenswrapper[4771]: I1001 15:52:50.245982 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjtl6" event={"ID":"b99c7a81-3d50-4289-b666-d8078af0865b","Type":"ContainerDied","Data":"00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891"} Oct 01 15:52:51 crc kubenswrapper[4771]: I1001 15:52:51.260589 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjtl6" 
event={"ID":"b99c7a81-3d50-4289-b666-d8078af0865b","Type":"ContainerStarted","Data":"1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2"} Oct 01 15:52:51 crc kubenswrapper[4771]: I1001 15:52:51.288874 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pjtl6" podStartSLOduration=2.843155001 podStartE2EDuration="5.288853419s" podCreationTimestamp="2025-10-01 15:52:46 +0000 UTC" firstStartedPulling="2025-10-01 15:52:48.219294905 +0000 UTC m=+3412.838470076" lastFinishedPulling="2025-10-01 15:52:50.664993313 +0000 UTC m=+3415.284168494" observedRunningTime="2025-10-01 15:52:51.284308447 +0000 UTC m=+3415.903483628" watchObservedRunningTime="2025-10-01 15:52:51.288853419 +0000 UTC m=+3415.908028590" Oct 01 15:52:57 crc kubenswrapper[4771]: I1001 15:52:57.355019 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:57 crc kubenswrapper[4771]: I1001 15:52:57.355760 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:57 crc kubenswrapper[4771]: I1001 15:52:57.430072 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:58 crc kubenswrapper[4771]: I1001 15:52:58.401845 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:52:58 crc kubenswrapper[4771]: I1001 15:52:58.463595 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjtl6"] Oct 01 15:53:00 crc kubenswrapper[4771]: I1001 15:53:00.361658 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pjtl6" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" containerName="registry-server" 
containerID="cri-o://1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2" gracePeriod=2 Oct 01 15:53:00 crc kubenswrapper[4771]: I1001 15:53:00.955620 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.104103 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5zf5\" (UniqueName: \"kubernetes.io/projected/b99c7a81-3d50-4289-b666-d8078af0865b-kube-api-access-f5zf5\") pod \"b99c7a81-3d50-4289-b666-d8078af0865b\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.104407 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-catalog-content\") pod \"b99c7a81-3d50-4289-b666-d8078af0865b\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.104625 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-utilities\") pod \"b99c7a81-3d50-4289-b666-d8078af0865b\" (UID: \"b99c7a81-3d50-4289-b666-d8078af0865b\") " Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.105997 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-utilities" (OuterVolumeSpecName: "utilities") pod "b99c7a81-3d50-4289-b666-d8078af0865b" (UID: "b99c7a81-3d50-4289-b666-d8078af0865b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.110084 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b99c7a81-3d50-4289-b666-d8078af0865b-kube-api-access-f5zf5" (OuterVolumeSpecName: "kube-api-access-f5zf5") pod "b99c7a81-3d50-4289-b666-d8078af0865b" (UID: "b99c7a81-3d50-4289-b666-d8078af0865b"). InnerVolumeSpecName "kube-api-access-f5zf5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.169186 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b99c7a81-3d50-4289-b666-d8078af0865b" (UID: "b99c7a81-3d50-4289-b666-d8078af0865b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.207701 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.207746 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99c7a81-3d50-4289-b666-d8078af0865b-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.207757 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5zf5\" (UniqueName: \"kubernetes.io/projected/b99c7a81-3d50-4289-b666-d8078af0865b-kube-api-access-f5zf5\") on node \"crc\" DevicePath \"\"" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.374306 4771 generic.go:334] "Generic (PLEG): container finished" podID="b99c7a81-3d50-4289-b666-d8078af0865b" 
containerID="1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2" exitCode=0 Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.374368 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pjtl6" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.374403 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjtl6" event={"ID":"b99c7a81-3d50-4289-b666-d8078af0865b","Type":"ContainerDied","Data":"1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2"} Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.374902 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pjtl6" event={"ID":"b99c7a81-3d50-4289-b666-d8078af0865b","Type":"ContainerDied","Data":"40f384f21eb0be696e2626844ea0f9ad757f70bbb0c9f1d6a5cd2647e6cef165"} Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.374930 4771 scope.go:117] "RemoveContainer" containerID="1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.411300 4771 scope.go:117] "RemoveContainer" containerID="00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.420331 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pjtl6"] Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.430636 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pjtl6"] Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.445920 4771 scope.go:117] "RemoveContainer" containerID="e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.494926 4771 scope.go:117] "RemoveContainer" containerID="1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2" Oct 01 15:53:01 crc 
kubenswrapper[4771]: E1001 15:53:01.495343 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2\": container with ID starting with 1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2 not found: ID does not exist" containerID="1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.495374 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2"} err="failed to get container status \"1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2\": rpc error: code = NotFound desc = could not find container \"1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2\": container with ID starting with 1a617e85c2a30960acccb7a71eba704a7fd8b60f7043a3a77b738baa9f952fa2 not found: ID does not exist" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.495399 4771 scope.go:117] "RemoveContainer" containerID="00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891" Oct 01 15:53:01 crc kubenswrapper[4771]: E1001 15:53:01.495792 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891\": container with ID starting with 00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891 not found: ID does not exist" containerID="00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.495854 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891"} err="failed to get container status 
\"00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891\": rpc error: code = NotFound desc = could not find container \"00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891\": container with ID starting with 00f40e8a91d8da7c2ecb06a78445d23eacb44eb39ed1715edd5554c44a99f891 not found: ID does not exist" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.495900 4771 scope.go:117] "RemoveContainer" containerID="e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7" Oct 01 15:53:01 crc kubenswrapper[4771]: E1001 15:53:01.496253 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7\": container with ID starting with e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7 not found: ID does not exist" containerID="e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7" Oct 01 15:53:01 crc kubenswrapper[4771]: I1001 15:53:01.496297 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7"} err="failed to get container status \"e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7\": rpc error: code = NotFound desc = could not find container \"e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7\": container with ID starting with e7574521c44babd49fe89c0d2d89214d21b0d4b43365a833c08e82e7a06420a7 not found: ID does not exist" Oct 01 15:53:02 crc kubenswrapper[4771]: I1001 15:53:02.002414 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" path="/var/lib/kubelet/pods/b99c7a81-3d50-4289-b666-d8078af0865b/volumes" Oct 01 15:54:42 crc kubenswrapper[4771]: I1001 15:54:42.177445 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:54:42 crc kubenswrapper[4771]: I1001 15:54:42.178056 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.417960 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pm2wx"] Oct 01 15:54:47 crc kubenswrapper[4771]: E1001 15:54:47.419130 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" containerName="registry-server" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.419151 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" containerName="registry-server" Oct 01 15:54:47 crc kubenswrapper[4771]: E1001 15:54:47.419171 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" containerName="extract-utilities" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.419181 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" containerName="extract-utilities" Oct 01 15:54:47 crc kubenswrapper[4771]: E1001 15:54:47.419218 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" containerName="extract-content" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.419229 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" containerName="extract-content" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 
15:54:47.419531 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="b99c7a81-3d50-4289-b666-d8078af0865b" containerName="registry-server" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.421690 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.435913 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pm2wx"] Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.473464 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-catalog-content\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.473561 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-utilities\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.473607 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c48v\" (UniqueName: \"kubernetes.io/projected/cac11e97-1d2f-48a7-92a2-a45aac8aa867-kube-api-access-6c48v\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.575289 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-utilities\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.575424 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c48v\" (UniqueName: \"kubernetes.io/projected/cac11e97-1d2f-48a7-92a2-a45aac8aa867-kube-api-access-6c48v\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.575798 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-catalog-content\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.576377 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-utilities\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.576446 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-catalog-content\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.597100 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c48v\" (UniqueName: 
\"kubernetes.io/projected/cac11e97-1d2f-48a7-92a2-a45aac8aa867-kube-api-access-6c48v\") pod \"community-operators-pm2wx\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:47 crc kubenswrapper[4771]: I1001 15:54:47.746751 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:48 crc kubenswrapper[4771]: I1001 15:54:48.304522 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pm2wx"] Oct 01 15:54:48 crc kubenswrapper[4771]: I1001 15:54:48.527067 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2wx" event={"ID":"cac11e97-1d2f-48a7-92a2-a45aac8aa867","Type":"ContainerStarted","Data":"102373705b246b859057b96d4df53d97df77c2f1400edf2885c629032729a38d"} Oct 01 15:54:49 crc kubenswrapper[4771]: I1001 15:54:49.540250 4771 generic.go:334] "Generic (PLEG): container finished" podID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerID="06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617" exitCode=0 Oct 01 15:54:49 crc kubenswrapper[4771]: I1001 15:54:49.540396 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2wx" event={"ID":"cac11e97-1d2f-48a7-92a2-a45aac8aa867","Type":"ContainerDied","Data":"06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617"} Oct 01 15:54:51 crc kubenswrapper[4771]: I1001 15:54:51.565202 4771 generic.go:334] "Generic (PLEG): container finished" podID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerID="5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05" exitCode=0 Oct 01 15:54:51 crc kubenswrapper[4771]: I1001 15:54:51.565296 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2wx" 
event={"ID":"cac11e97-1d2f-48a7-92a2-a45aac8aa867","Type":"ContainerDied","Data":"5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05"} Oct 01 15:54:52 crc kubenswrapper[4771]: I1001 15:54:52.578992 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2wx" event={"ID":"cac11e97-1d2f-48a7-92a2-a45aac8aa867","Type":"ContainerStarted","Data":"0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807"} Oct 01 15:54:52 crc kubenswrapper[4771]: I1001 15:54:52.612675 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pm2wx" podStartSLOduration=2.982832832 podStartE2EDuration="5.612647151s" podCreationTimestamp="2025-10-01 15:54:47 +0000 UTC" firstStartedPulling="2025-10-01 15:54:49.542641766 +0000 UTC m=+3534.161816947" lastFinishedPulling="2025-10-01 15:54:52.172456095 +0000 UTC m=+3536.791631266" observedRunningTime="2025-10-01 15:54:52.598373089 +0000 UTC m=+3537.217548270" watchObservedRunningTime="2025-10-01 15:54:52.612647151 +0000 UTC m=+3537.231822332" Oct 01 15:54:57 crc kubenswrapper[4771]: I1001 15:54:57.748575 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:57 crc kubenswrapper[4771]: I1001 15:54:57.749389 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:57 crc kubenswrapper[4771]: I1001 15:54:57.842076 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:58 crc kubenswrapper[4771]: I1001 15:54:58.724863 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:54:58 crc kubenswrapper[4771]: I1001 15:54:58.781887 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-pm2wx"] Oct 01 15:55:00 crc kubenswrapper[4771]: I1001 15:55:00.694568 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pm2wx" podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerName="registry-server" containerID="cri-o://0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807" gracePeriod=2 Oct 01 15:55:00 crc kubenswrapper[4771]: E1001 15:55:00.815877 4771 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcac11e97_1d2f_48a7_92a2_a45aac8aa867.slice/crio-0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807.scope\": RecentStats: unable to find data in memory cache]" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.189303 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.347024 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c48v\" (UniqueName: \"kubernetes.io/projected/cac11e97-1d2f-48a7-92a2-a45aac8aa867-kube-api-access-6c48v\") pod \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.347291 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-catalog-content\") pod \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.347374 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-utilities\") pod \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\" (UID: \"cac11e97-1d2f-48a7-92a2-a45aac8aa867\") " Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.348348 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-utilities" (OuterVolumeSpecName: "utilities") pod "cac11e97-1d2f-48a7-92a2-a45aac8aa867" (UID: "cac11e97-1d2f-48a7-92a2-a45aac8aa867"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.352401 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cac11e97-1d2f-48a7-92a2-a45aac8aa867-kube-api-access-6c48v" (OuterVolumeSpecName: "kube-api-access-6c48v") pod "cac11e97-1d2f-48a7-92a2-a45aac8aa867" (UID: "cac11e97-1d2f-48a7-92a2-a45aac8aa867"). InnerVolumeSpecName "kube-api-access-6c48v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.399850 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cac11e97-1d2f-48a7-92a2-a45aac8aa867" (UID: "cac11e97-1d2f-48a7-92a2-a45aac8aa867"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.450293 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c48v\" (UniqueName: \"kubernetes.io/projected/cac11e97-1d2f-48a7-92a2-a45aac8aa867-kube-api-access-6c48v\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.450340 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.450358 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cac11e97-1d2f-48a7-92a2-a45aac8aa867-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.708360 4771 generic.go:334] "Generic (PLEG): container finished" podID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerID="0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807" exitCode=0 Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.708432 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pm2wx" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.708479 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2wx" event={"ID":"cac11e97-1d2f-48a7-92a2-a45aac8aa867","Type":"ContainerDied","Data":"0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807"} Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.709465 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pm2wx" event={"ID":"cac11e97-1d2f-48a7-92a2-a45aac8aa867","Type":"ContainerDied","Data":"102373705b246b859057b96d4df53d97df77c2f1400edf2885c629032729a38d"} Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.709508 4771 scope.go:117] "RemoveContainer" containerID="0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.740306 4771 scope.go:117] "RemoveContainer" containerID="5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.750185 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pm2wx"] Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.759550 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pm2wx"] Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.773310 4771 scope.go:117] "RemoveContainer" containerID="06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.814221 4771 scope.go:117] "RemoveContainer" containerID="0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807" Oct 01 15:55:01 crc kubenswrapper[4771]: E1001 15:55:01.814666 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807\": container with ID starting with 0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807 not found: ID does not exist" containerID="0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.814701 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807"} err="failed to get container status \"0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807\": rpc error: code = NotFound desc = could not find container \"0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807\": container with ID starting with 0a367c49556be31788725f6f60eaab6125f2d8cc4f0f0174c096a19992ff3807 not found: ID does not exist" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.814741 4771 scope.go:117] "RemoveContainer" containerID="5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05" Oct 01 15:55:01 crc kubenswrapper[4771]: E1001 15:55:01.815330 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05\": container with ID starting with 5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05 not found: ID does not exist" containerID="5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.815377 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05"} err="failed to get container status \"5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05\": rpc error: code = NotFound desc = could not find container \"5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05\": container with ID 
starting with 5e1b89ca1e5c584329aa943704661b6f68743a5708d52f9a698bcf1bb0cb9b05 not found: ID does not exist" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.815408 4771 scope.go:117] "RemoveContainer" containerID="06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617" Oct 01 15:55:01 crc kubenswrapper[4771]: E1001 15:55:01.815860 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617\": container with ID starting with 06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617 not found: ID does not exist" containerID="06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.815885 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617"} err="failed to get container status \"06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617\": rpc error: code = NotFound desc = could not find container \"06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617\": container with ID starting with 06c698def1a582b28d1b8e986125a8f9c2b6fdfbdae86a7b603fa7aa21b70617 not found: ID does not exist" Oct 01 15:55:01 crc kubenswrapper[4771]: I1001 15:55:01.997280 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" path="/var/lib/kubelet/pods/cac11e97-1d2f-48a7-92a2-a45aac8aa867/volumes" Oct 01 15:55:12 crc kubenswrapper[4771]: I1001 15:55:12.176781 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:55:12 crc kubenswrapper[4771]: I1001 
15:55:12.177346 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:55:42 crc kubenswrapper[4771]: I1001 15:55:42.176852 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 15:55:42 crc kubenswrapper[4771]: I1001 15:55:42.177513 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 15:55:42 crc kubenswrapper[4771]: I1001 15:55:42.177573 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 15:55:42 crc kubenswrapper[4771]: I1001 15:55:42.178381 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 15:55:42 crc kubenswrapper[4771]: I1001 15:55:42.178452 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" 
containerName="machine-config-daemon" containerID="cri-o://a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" gracePeriod=600 Oct 01 15:55:42 crc kubenswrapper[4771]: E1001 15:55:42.311309 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:55:43 crc kubenswrapper[4771]: I1001 15:55:43.135564 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" exitCode=0 Oct 01 15:55:43 crc kubenswrapper[4771]: I1001 15:55:43.135709 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269"} Oct 01 15:55:43 crc kubenswrapper[4771]: I1001 15:55:43.135975 4771 scope.go:117] "RemoveContainer" containerID="1abb350ca0e7c414aeed7a01011f0b66cc36f50524fa0b00e58e82c7df6d5615" Oct 01 15:55:43 crc kubenswrapper[4771]: I1001 15:55:43.136701 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:55:43 crc kubenswrapper[4771]: E1001 15:55:43.136981 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:55:51 crc kubenswrapper[4771]: I1001 15:55:51.215129 4771 generic.go:334] "Generic (PLEG): container finished" podID="32741182-3a7c-43a7-b996-1dd78a418dc6" containerID="cc9eac0ff84ca09e142b9e58f102604864c84ad74f6ac234bbc3f1bbe8986fca" exitCode=0 Oct 01 15:55:51 crc kubenswrapper[4771]: I1001 15:55:51.215248 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"32741182-3a7c-43a7-b996-1dd78a418dc6","Type":"ContainerDied","Data":"cc9eac0ff84ca09e142b9e58f102604864c84ad74f6ac234bbc3f1bbe8986fca"} Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.630460 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.756153 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnps4\" (UniqueName: \"kubernetes.io/projected/32741182-3a7c-43a7-b996-1dd78a418dc6-kube-api-access-gnps4\") pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.756264 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ca-certs\") pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.756339 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config\") pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.756410 4771 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ssh-key\") pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.756486 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config-secret\") pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.757258 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-temporary\") pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.757312 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-config-data\") pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.757469 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.757527 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-workdir\") 
pod \"32741182-3a7c-43a7-b996-1dd78a418dc6\" (UID: \"32741182-3a7c-43a7-b996-1dd78a418dc6\") " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.758046 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.758923 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-config-data" (OuterVolumeSpecName: "config-data") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.761799 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.764938 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.765030 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32741182-3a7c-43a7-b996-1dd78a418dc6-kube-api-access-gnps4" (OuterVolumeSpecName: "kube-api-access-gnps4") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "kube-api-access-gnps4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.794699 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.796132 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.812377 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.829456 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "32741182-3a7c-43a7-b996-1dd78a418dc6" (UID: "32741182-3a7c-43a7-b996-1dd78a418dc6"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861209 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnps4\" (UniqueName: \"kubernetes.io/projected/32741182-3a7c-43a7-b996-1dd78a418dc6-kube-api-access-gnps4\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861260 4771 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861363 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861381 4771 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861395 4771 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/32741182-3a7c-43a7-b996-1dd78a418dc6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861409 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/32741182-3a7c-43a7-b996-1dd78a418dc6-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861453 4771 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861536 4771 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.861554 4771 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/32741182-3a7c-43a7-b996-1dd78a418dc6-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.888969 4771 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 01 15:55:52 crc kubenswrapper[4771]: I1001 15:55:52.963190 4771 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 01 15:55:53 crc kubenswrapper[4771]: I1001 15:55:53.238091 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"32741182-3a7c-43a7-b996-1dd78a418dc6","Type":"ContainerDied","Data":"cce6e860c528cc587042ee77f1257f13f58e6ed28483e2c8082a80c2f57b3068"} Oct 01 15:55:53 crc kubenswrapper[4771]: I1001 15:55:53.238136 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cce6e860c528cc587042ee77f1257f13f58e6ed28483e2c8082a80c2f57b3068" Oct 01 15:55:53 crc 
kubenswrapper[4771]: I1001 15:55:53.238137 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 01 15:55:57 crc kubenswrapper[4771]: I1001 15:55:57.984839 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:55:57 crc kubenswrapper[4771]: E1001 15:55:57.985505 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.198401 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 15:55:58 crc kubenswrapper[4771]: E1001 15:55:58.198879 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32741182-3a7c-43a7-b996-1dd78a418dc6" containerName="tempest-tests-tempest-tests-runner" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.198899 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="32741182-3a7c-43a7-b996-1dd78a418dc6" containerName="tempest-tests-tempest-tests-runner" Oct 01 15:55:58 crc kubenswrapper[4771]: E1001 15:55:58.198928 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerName="registry-server" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.198936 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerName="registry-server" Oct 01 15:55:58 crc kubenswrapper[4771]: E1001 15:55:58.198948 4771 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerName="extract-content" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.198956 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerName="extract-content" Oct 01 15:55:58 crc kubenswrapper[4771]: E1001 15:55:58.198992 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerName="extract-utilities" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.199000 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerName="extract-utilities" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.199222 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="32741182-3a7c-43a7-b996-1dd78a418dc6" containerName="tempest-tests-tempest-tests-runner" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.199259 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="cac11e97-1d2f-48a7-92a2-a45aac8aa867" containerName="registry-server" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.200250 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.204628 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-n7bzx" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.221970 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.275537 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b503da-0417-4b9a-b62e-3f13be34b988\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.275649 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-566fl\" (UniqueName: \"kubernetes.io/projected/e0b503da-0417-4b9a-b62e-3f13be34b988-kube-api-access-566fl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b503da-0417-4b9a-b62e-3f13be34b988\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.387453 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-566fl\" (UniqueName: \"kubernetes.io/projected/e0b503da-0417-4b9a-b62e-3f13be34b988-kube-api-access-566fl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b503da-0417-4b9a-b62e-3f13be34b988\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.387668 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b503da-0417-4b9a-b62e-3f13be34b988\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.388313 4771 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b503da-0417-4b9a-b62e-3f13be34b988\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.406802 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-566fl\" (UniqueName: \"kubernetes.io/projected/e0b503da-0417-4b9a-b62e-3f13be34b988-kube-api-access-566fl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b503da-0417-4b9a-b62e-3f13be34b988\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.415637 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e0b503da-0417-4b9a-b62e-3f13be34b988\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:58 crc kubenswrapper[4771]: I1001 15:55:58.529902 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 01 15:55:59 crc kubenswrapper[4771]: I1001 15:55:59.041428 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 01 15:55:59 crc kubenswrapper[4771]: I1001 15:55:59.058004 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 15:55:59 crc kubenswrapper[4771]: I1001 15:55:59.302567 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e0b503da-0417-4b9a-b62e-3f13be34b988","Type":"ContainerStarted","Data":"29fc876a135b30a5d5f2d749f3d2f418dbe5c0e80802528d0e0e6c991c9ed91d"} Oct 01 15:56:01 crc kubenswrapper[4771]: I1001 15:56:01.327528 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e0b503da-0417-4b9a-b62e-3f13be34b988","Type":"ContainerStarted","Data":"30c888e2b7a091042d436efb13ffc08ea7e6277370d5bc86c9ecc8d618e32ea6"} Oct 01 15:56:01 crc kubenswrapper[4771]: I1001 15:56:01.347212 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.129539679 podStartE2EDuration="3.347188954s" podCreationTimestamp="2025-10-01 15:55:58 +0000 UTC" firstStartedPulling="2025-10-01 15:55:59.057625619 +0000 UTC m=+3603.676800800" lastFinishedPulling="2025-10-01 15:56:00.275274904 +0000 UTC m=+3604.894450075" observedRunningTime="2025-10-01 15:56:01.342532649 +0000 UTC m=+3605.961707820" watchObservedRunningTime="2025-10-01 15:56:01.347188954 +0000 UTC m=+3605.966364135" Oct 01 15:56:10 crc kubenswrapper[4771]: I1001 15:56:10.984985 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:56:10 crc kubenswrapper[4771]: E1001 
15:56:10.985807 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.469433 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rf44r/must-gather-2mzcl"] Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.471441 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.474593 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rf44r"/"kube-root-ca.crt" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.476551 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rf44r"/"openshift-service-ca.crt" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.521807 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rf44r/must-gather-2mzcl"] Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.588203 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ltb2\" (UniqueName: \"kubernetes.io/projected/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-kube-api-access-8ltb2\") pod \"must-gather-2mzcl\" (UID: \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\") " pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.588647 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-must-gather-output\") pod \"must-gather-2mzcl\" (UID: \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\") " pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.690042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ltb2\" (UniqueName: \"kubernetes.io/projected/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-kube-api-access-8ltb2\") pod \"must-gather-2mzcl\" (UID: \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\") " pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.690181 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-must-gather-output\") pod \"must-gather-2mzcl\" (UID: \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\") " pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.690568 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-must-gather-output\") pod \"must-gather-2mzcl\" (UID: \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\") " pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.707460 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ltb2\" (UniqueName: \"kubernetes.io/projected/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-kube-api-access-8ltb2\") pod \"must-gather-2mzcl\" (UID: \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\") " pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 15:56:17 crc kubenswrapper[4771]: I1001 15:56:17.793975 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 15:56:18 crc kubenswrapper[4771]: I1001 15:56:18.239238 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rf44r/must-gather-2mzcl"] Oct 01 15:56:18 crc kubenswrapper[4771]: W1001 15:56:18.247624 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d8f06d5_e155_46cc_bf0b_0a690ab4a3a9.slice/crio-1d1dce0add14819bc6af73be8b4a2f6eae4bb923342c68326166b1b7b75d1978 WatchSource:0}: Error finding container 1d1dce0add14819bc6af73be8b4a2f6eae4bb923342c68326166b1b7b75d1978: Status 404 returned error can't find the container with id 1d1dce0add14819bc6af73be8b4a2f6eae4bb923342c68326166b1b7b75d1978 Oct 01 15:56:18 crc kubenswrapper[4771]: I1001 15:56:18.508279 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/must-gather-2mzcl" event={"ID":"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9","Type":"ContainerStarted","Data":"1d1dce0add14819bc6af73be8b4a2f6eae4bb923342c68326166b1b7b75d1978"} Oct 01 15:56:21 crc kubenswrapper[4771]: I1001 15:56:21.986478 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:56:21 crc kubenswrapper[4771]: E1001 15:56:21.987346 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:56:26 crc kubenswrapper[4771]: I1001 15:56:26.593356 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/must-gather-2mzcl" 
event={"ID":"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9","Type":"ContainerStarted","Data":"b43a2123bb1eaa535c05de6ee8efc3e138362e16812b651b02a41593cc9cecda"} Oct 01 15:56:26 crc kubenswrapper[4771]: I1001 15:56:26.594947 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/must-gather-2mzcl" event={"ID":"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9","Type":"ContainerStarted","Data":"201b1073dbb9a79ad0370e3e4ee1d23658754299a0614d3fd78e16a0863e704c"} Oct 01 15:56:26 crc kubenswrapper[4771]: I1001 15:56:26.613612 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rf44r/must-gather-2mzcl" podStartSLOduration=1.946147302 podStartE2EDuration="9.613590173s" podCreationTimestamp="2025-10-01 15:56:17 +0000 UTC" firstStartedPulling="2025-10-01 15:56:18.250003727 +0000 UTC m=+3622.869178898" lastFinishedPulling="2025-10-01 15:56:25.917446588 +0000 UTC m=+3630.536621769" observedRunningTime="2025-10-01 15:56:26.607508033 +0000 UTC m=+3631.226683204" watchObservedRunningTime="2025-10-01 15:56:26.613590173 +0000 UTC m=+3631.232765364" Oct 01 15:56:29 crc kubenswrapper[4771]: I1001 15:56:29.836380 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rf44r/crc-debug-v8ttv"] Oct 01 15:56:29 crc kubenswrapper[4771]: I1001 15:56:29.838110 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:56:29 crc kubenswrapper[4771]: I1001 15:56:29.840298 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rf44r"/"default-dockercfg-r4t4p" Oct 01 15:56:29 crc kubenswrapper[4771]: I1001 15:56:29.927325 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-host\") pod \"crc-debug-v8ttv\" (UID: \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\") " pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:56:29 crc kubenswrapper[4771]: I1001 15:56:29.927416 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh6fv\" (UniqueName: \"kubernetes.io/projected/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-kube-api-access-vh6fv\") pod \"crc-debug-v8ttv\" (UID: \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\") " pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:56:30 crc kubenswrapper[4771]: I1001 15:56:30.029042 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh6fv\" (UniqueName: \"kubernetes.io/projected/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-kube-api-access-vh6fv\") pod \"crc-debug-v8ttv\" (UID: \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\") " pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:56:30 crc kubenswrapper[4771]: I1001 15:56:30.029357 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-host\") pod \"crc-debug-v8ttv\" (UID: \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\") " pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:56:30 crc kubenswrapper[4771]: I1001 15:56:30.029908 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-host\") pod \"crc-debug-v8ttv\" (UID: \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\") " pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:56:30 crc kubenswrapper[4771]: I1001 15:56:30.053678 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh6fv\" (UniqueName: \"kubernetes.io/projected/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-kube-api-access-vh6fv\") pod \"crc-debug-v8ttv\" (UID: \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\") " pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:56:30 crc kubenswrapper[4771]: I1001 15:56:30.157272 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:56:30 crc kubenswrapper[4771]: I1001 15:56:30.630846 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/crc-debug-v8ttv" event={"ID":"d71374b2-c1f0-4d3a-b8b7-783495ecb58e","Type":"ContainerStarted","Data":"159a32fd714ef1c096c5fa40e06836e2f0859f3001ccd3d790cd920d6c39adc2"} Oct 01 15:56:33 crc kubenswrapper[4771]: I1001 15:56:33.985070 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:56:33 crc kubenswrapper[4771]: E1001 15:56:33.986025 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:56:43 crc kubenswrapper[4771]: I1001 15:56:43.764007 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/crc-debug-v8ttv" 
event={"ID":"d71374b2-c1f0-4d3a-b8b7-783495ecb58e","Type":"ContainerStarted","Data":"90187f2c2543904cf59f6abfb11ad8624253bfd98d8bdaa9f89618cfe5d81bfe"} Oct 01 15:56:43 crc kubenswrapper[4771]: I1001 15:56:43.785707 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rf44r/crc-debug-v8ttv" podStartSLOduration=1.6584656899999999 podStartE2EDuration="14.785684896s" podCreationTimestamp="2025-10-01 15:56:29 +0000 UTC" firstStartedPulling="2025-10-01 15:56:30.196889468 +0000 UTC m=+3634.816064649" lastFinishedPulling="2025-10-01 15:56:43.324108674 +0000 UTC m=+3647.943283855" observedRunningTime="2025-10-01 15:56:43.778656012 +0000 UTC m=+3648.397831183" watchObservedRunningTime="2025-10-01 15:56:43.785684896 +0000 UTC m=+3648.404860067" Oct 01 15:56:47 crc kubenswrapper[4771]: I1001 15:56:47.985474 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:56:47 crc kubenswrapper[4771]: E1001 15:56:47.986224 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:56:59 crc kubenswrapper[4771]: I1001 15:56:59.986085 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:56:59 crc kubenswrapper[4771]: E1001 15:56:59.987165 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:57:14 crc kubenswrapper[4771]: I1001 15:57:14.985596 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:57:14 crc kubenswrapper[4771]: E1001 15:57:14.986424 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:57:29 crc kubenswrapper[4771]: I1001 15:57:29.323074 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-59879d9576-fvcgl_d7137719-9397-4b5e-97ae-10176a7deea3/barbican-api/0.log" Oct 01 15:57:29 crc kubenswrapper[4771]: I1001 15:57:29.413972 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-59879d9576-fvcgl_d7137719-9397-4b5e-97ae-10176a7deea3/barbican-api-log/0.log" Oct 01 15:57:29 crc kubenswrapper[4771]: I1001 15:57:29.491885 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-755c69f65b-sb4nj_90aaa270-c5a1-47b4-8adc-2bd096da3ab0/barbican-keystone-listener/0.log" Oct 01 15:57:29 crc kubenswrapper[4771]: I1001 15:57:29.645186 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-755c69f65b-sb4nj_90aaa270-c5a1-47b4-8adc-2bd096da3ab0/barbican-keystone-listener-log/0.log" Oct 01 15:57:29 crc kubenswrapper[4771]: I1001 15:57:29.696068 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-74dd9b479f-cpgmx_962b1815-b3dd-47fc-afdf-97a82cc67893/barbican-worker/0.log" Oct 01 15:57:29 crc kubenswrapper[4771]: I1001 15:57:29.823157 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74dd9b479f-cpgmx_962b1815-b3dd-47fc-afdf-97a82cc67893/barbican-worker-log/0.log" Oct 01 15:57:29 crc kubenswrapper[4771]: I1001 15:57:29.978007 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8_0498d724-f802-4a21-9197-f87079f3c96e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:29 crc kubenswrapper[4771]: I1001 15:57:29.986073 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:57:29 crc kubenswrapper[4771]: E1001 15:57:29.986435 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:57:30 crc kubenswrapper[4771]: I1001 15:57:30.127016 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fb688fd-269e-4d0f-a84f-ccb670696d20/ceilometer-central-agent/0.log" Oct 01 15:57:30 crc kubenswrapper[4771]: I1001 15:57:30.215705 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fb688fd-269e-4d0f-a84f-ccb670696d20/ceilometer-notification-agent/0.log" Oct 01 15:57:30 crc kubenswrapper[4771]: I1001 15:57:30.264947 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fb688fd-269e-4d0f-a84f-ccb670696d20/proxy-httpd/0.log" Oct 01 15:57:30 crc kubenswrapper[4771]: 
I1001 15:57:30.290887 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fb688fd-269e-4d0f-a84f-ccb670696d20/sg-core/0.log" Oct 01 15:57:30 crc kubenswrapper[4771]: I1001 15:57:30.569256 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_12fc4771-6958-4668-adcc-6aa10e36e1ea/cinder-api/0.log" Oct 01 15:57:30 crc kubenswrapper[4771]: I1001 15:57:30.577170 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_12fc4771-6958-4668-adcc-6aa10e36e1ea/cinder-api-log/0.log" Oct 01 15:57:30 crc kubenswrapper[4771]: I1001 15:57:30.813854 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3a36d28b-706e-4639-9d68-158427aaa655/cinder-scheduler/0.log" Oct 01 15:57:30 crc kubenswrapper[4771]: I1001 15:57:30.825309 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3a36d28b-706e-4639-9d68-158427aaa655/probe/0.log" Oct 01 15:57:31 crc kubenswrapper[4771]: I1001 15:57:31.001198 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt_5598c0d1-a4ba-4824-8111-dddf70823911/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:31 crc kubenswrapper[4771]: I1001 15:57:31.150710 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-txt9q_a64a2e26-92a1-4578-9a5a-fc5e8062f1b8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:31 crc kubenswrapper[4771]: I1001 15:57:31.259563 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vv89q_ae5eb9bd-1612-4698-850a-21e0b335a920/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:31 crc kubenswrapper[4771]: I1001 15:57:31.446920 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-952sj_e5a19de7-6ecc-4f22-bc0c-18f3761eef3c/init/0.log" Oct 01 15:57:31 crc kubenswrapper[4771]: I1001 15:57:31.615322 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-952sj_e5a19de7-6ecc-4f22-bc0c-18f3761eef3c/init/0.log" Oct 01 15:57:31 crc kubenswrapper[4771]: I1001 15:57:31.650270 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-952sj_e5a19de7-6ecc-4f22-bc0c-18f3761eef3c/dnsmasq-dns/0.log" Oct 01 15:57:31 crc kubenswrapper[4771]: I1001 15:57:31.802570 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6_c69dcf56-20fa-4a9a-992c-a73435ff9102/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:31 crc kubenswrapper[4771]: I1001 15:57:31.954652 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bd39907f-ab26-47d6-9d78-0b0437de4b04/glance-httpd/0.log" Oct 01 15:57:32 crc kubenswrapper[4771]: I1001 15:57:32.038410 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bd39907f-ab26-47d6-9d78-0b0437de4b04/glance-log/0.log" Oct 01 15:57:32 crc kubenswrapper[4771]: I1001 15:57:32.142068 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_83dc7d05-3ef5-4da2-b4ea-58d3c11d4528/glance-httpd/0.log" Oct 01 15:57:32 crc kubenswrapper[4771]: I1001 15:57:32.278895 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_83dc7d05-3ef5-4da2-b4ea-58d3c11d4528/glance-log/0.log" Oct 01 15:57:32 crc kubenswrapper[4771]: I1001 15:57:32.501594 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-66679756f6-g56hw_405e12dd-6888-4994-ac26-b2836ad9069c/horizon/0.log" Oct 01 15:57:32 crc kubenswrapper[4771]: I1001 15:57:32.618548 
4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qk797_e6e7232e-0b6f-433f-a1e5-f99aab22ed8a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:32 crc kubenswrapper[4771]: I1001 15:57:32.826063 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-66679756f6-g56hw_405e12dd-6888-4994-ac26-b2836ad9069c/horizon-log/0.log" Oct 01 15:57:32 crc kubenswrapper[4771]: I1001 15:57:32.846864 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zkh9n_333518f5-86a1-4afc-974d-c3dbee185c42/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:33 crc kubenswrapper[4771]: I1001 15:57:33.051051 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_bbfa5749-f148-47da-8cbf-b88b1ea7bd9f/kube-state-metrics/0.log" Oct 01 15:57:33 crc kubenswrapper[4771]: I1001 15:57:33.140670 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-74b8bdcb7c-xttgq_5e356f03-9445-4825-a39d-8b564bd4ea1c/keystone-api/0.log" Oct 01 15:57:33 crc kubenswrapper[4771]: I1001 15:57:33.296338 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k_87149516-d807-4412-90a5-e127c03943e0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:33 crc kubenswrapper[4771]: I1001 15:57:33.914610 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67bb68cc5c-l7gnn_a7b28e9a-d59d-4aba-97c6-9102ada72a28/neutron-httpd/0.log" Oct 01 15:57:33 crc kubenswrapper[4771]: I1001 15:57:33.926451 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67bb68cc5c-l7gnn_a7b28e9a-d59d-4aba-97c6-9102ada72a28/neutron-api/0.log" Oct 01 15:57:34 crc kubenswrapper[4771]: I1001 15:57:34.099298 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s_9c50c132-15f0-45c7-a895-46fe2be6003e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:34 crc kubenswrapper[4771]: I1001 15:57:34.609467 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f5ed9a0d-0d21-4432-aae5-dca422c5c331/nova-api-log/0.log" Oct 01 15:57:34 crc kubenswrapper[4771]: I1001 15:57:34.721979 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f5ed9a0d-0d21-4432-aae5-dca422c5c331/nova-api-api/0.log" Oct 01 15:57:34 crc kubenswrapper[4771]: I1001 15:57:34.756860 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a01636b1-c705-4844-94b8-bb58e65faa1f/nova-cell0-conductor-conductor/0.log" Oct 01 15:57:35 crc kubenswrapper[4771]: I1001 15:57:35.019177 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8a756a41-c563-4d6c-a5e8-907724f1847c/nova-cell1-conductor-conductor/0.log" Oct 01 15:57:35 crc kubenswrapper[4771]: I1001 15:57:35.117111 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a8db97b3-f960-4eff-a879-c2b42c4e6364/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 15:57:35 crc kubenswrapper[4771]: I1001 15:57:35.292772 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lbdsd_ea0299a3-63d8-41e7-a23d-4ddd7491df9c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:35 crc kubenswrapper[4771]: I1001 15:57:35.618691 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10227ed2-4069-45bc-b3b9-091bb98d72af/nova-metadata-log/0.log" Oct 01 15:57:36 crc kubenswrapper[4771]: I1001 15:57:36.011710 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_f5a9190d-63c4-47e3-9fcd-ed0e0615d807/nova-scheduler-scheduler/0.log" Oct 01 15:57:36 crc kubenswrapper[4771]: I1001 15:57:36.152654 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_37c38012-d257-4269-86fa-8cf3ef4de4cd/mysql-bootstrap/0.log" Oct 01 15:57:36 crc kubenswrapper[4771]: I1001 15:57:36.420602 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_37c38012-d257-4269-86fa-8cf3ef4de4cd/mysql-bootstrap/0.log" Oct 01 15:57:36 crc kubenswrapper[4771]: I1001 15:57:36.440181 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_37c38012-d257-4269-86fa-8cf3ef4de4cd/galera/0.log" Oct 01 15:57:36 crc kubenswrapper[4771]: I1001 15:57:36.680409 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e902a14c-a59a-4278-b560-33de2cb50d32/mysql-bootstrap/0.log" Oct 01 15:57:36 crc kubenswrapper[4771]: I1001 15:57:36.786037 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10227ed2-4069-45bc-b3b9-091bb98d72af/nova-metadata-metadata/0.log" Oct 01 15:57:36 crc kubenswrapper[4771]: I1001 15:57:36.900335 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e902a14c-a59a-4278-b560-33de2cb50d32/mysql-bootstrap/0.log" Oct 01 15:57:37 crc kubenswrapper[4771]: I1001 15:57:37.001270 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e902a14c-a59a-4278-b560-33de2cb50d32/galera/0.log" Oct 01 15:57:37 crc kubenswrapper[4771]: I1001 15:57:37.114550 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11/openstackclient/0.log" Oct 01 15:57:37 crc kubenswrapper[4771]: I1001 15:57:37.320504 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-tckkt_553f381f-ef83-4876-8a81-df81a5be7dd8/openstack-network-exporter/0.log" Oct 01 15:57:37 crc kubenswrapper[4771]: I1001 15:57:37.530487 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rv4hj_74fee09d-11ad-45f4-a779-4c352b6dc67f/ovsdb-server-init/0.log" Oct 01 15:57:37 crc kubenswrapper[4771]: I1001 15:57:37.761111 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rv4hj_74fee09d-11ad-45f4-a779-4c352b6dc67f/ovsdb-server-init/0.log" Oct 01 15:57:37 crc kubenswrapper[4771]: I1001 15:57:37.827540 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rv4hj_74fee09d-11ad-45f4-a779-4c352b6dc67f/ovs-vswitchd/0.log" Oct 01 15:57:37 crc kubenswrapper[4771]: I1001 15:57:37.834604 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rv4hj_74fee09d-11ad-45f4-a779-4c352b6dc67f/ovsdb-server/0.log" Oct 01 15:57:38 crc kubenswrapper[4771]: I1001 15:57:38.102365 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zpdvh_20d8761e-4ce2-4312-8a80-8c3ce8908f2c/ovn-controller/0.log" Oct 01 15:57:38 crc kubenswrapper[4771]: I1001 15:57:38.301992 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-h96xv_0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:38 crc kubenswrapper[4771]: I1001 15:57:38.370319 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_553def8f-6710-4724-a4b7-a9f6e2c310e6/openstack-network-exporter/0.log" Oct 01 15:57:38 crc kubenswrapper[4771]: I1001 15:57:38.507588 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_553def8f-6710-4724-a4b7-a9f6e2c310e6/ovn-northd/0.log" Oct 01 15:57:38 crc kubenswrapper[4771]: I1001 15:57:38.604525 4771 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_064359ac-92c0-4674-a919-ccb8ffc0a5df/openstack-network-exporter/0.log" Oct 01 15:57:38 crc kubenswrapper[4771]: I1001 15:57:38.732882 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_064359ac-92c0-4674-a919-ccb8ffc0a5df/ovsdbserver-nb/0.log" Oct 01 15:57:38 crc kubenswrapper[4771]: I1001 15:57:38.827342 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_999431d1-6d92-46de-ba0f-b253f96fe627/openstack-network-exporter/0.log" Oct 01 15:57:38 crc kubenswrapper[4771]: I1001 15:57:38.976483 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_999431d1-6d92-46de-ba0f-b253f96fe627/ovsdbserver-sb/0.log" Oct 01 15:57:39 crc kubenswrapper[4771]: I1001 15:57:39.131415 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67bb557f68-mz5cv_6dfa4374-0400-489e-90eb-baca0f8afdfd/placement-api/0.log" Oct 01 15:57:39 crc kubenswrapper[4771]: I1001 15:57:39.269602 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67bb557f68-mz5cv_6dfa4374-0400-489e-90eb-baca0f8afdfd/placement-log/0.log" Oct 01 15:57:39 crc kubenswrapper[4771]: I1001 15:57:39.394406 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_15eb6248-64ff-4f3d-bcb4-4d78026673d4/setup-container/0.log" Oct 01 15:57:39 crc kubenswrapper[4771]: I1001 15:57:39.597581 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_15eb6248-64ff-4f3d-bcb4-4d78026673d4/rabbitmq/0.log" Oct 01 15:57:39 crc kubenswrapper[4771]: I1001 15:57:39.620875 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_15eb6248-64ff-4f3d-bcb4-4d78026673d4/setup-container/0.log" Oct 01 15:57:39 crc kubenswrapper[4771]: I1001 15:57:39.821257 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad32a9ec-a803-4d44-a4c1-03447e26e983/setup-container/0.log" Oct 01 15:57:39 crc kubenswrapper[4771]: I1001 15:57:39.988365 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad32a9ec-a803-4d44-a4c1-03447e26e983/setup-container/0.log" Oct 01 15:57:40 crc kubenswrapper[4771]: I1001 15:57:40.055135 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad32a9ec-a803-4d44-a4c1-03447e26e983/rabbitmq/0.log" Oct 01 15:57:40 crc kubenswrapper[4771]: I1001 15:57:40.511599 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-l4srh_856ab139-589a-4b24-89ab-37ef20ef1762/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:40 crc kubenswrapper[4771]: I1001 15:57:40.539488 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz_560e443b-7ae0-4b0c-912d-6f7895b3a8dd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:40 crc kubenswrapper[4771]: I1001 15:57:40.704212 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs_a91e8801-f8f9-4ce4-ba42-a4fa54057ec1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:40 crc kubenswrapper[4771]: I1001 15:57:40.813819 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-d6dm9_8a04e35a-e2e5-412d-ab61-896f5271ac14/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:41 crc kubenswrapper[4771]: I1001 15:57:41.083202 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cg72d_3d9fa44b-220b-4f14-824c-1393dd61fc88/ssh-known-hosts-edpm-deployment/0.log" Oct 01 15:57:41 crc kubenswrapper[4771]: I1001 15:57:41.315725 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-proxy-d6484bc47-hkzdw_e4b812be-6e39-4ac8-b43f-dba345603f74/proxy-httpd/0.log" Oct 01 15:57:41 crc kubenswrapper[4771]: I1001 15:57:41.352744 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d6484bc47-hkzdw_e4b812be-6e39-4ac8-b43f-dba345603f74/proxy-server/0.log" Oct 01 15:57:41 crc kubenswrapper[4771]: I1001 15:57:41.540587 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hzvjt_a56d3441-b413-4629-870b-49c208943243/swift-ring-rebalance/0.log" Oct 01 15:57:41 crc kubenswrapper[4771]: I1001 15:57:41.548673 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/account-auditor/0.log" Oct 01 15:57:41 crc kubenswrapper[4771]: I1001 15:57:41.827141 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/account-reaper/0.log" Oct 01 15:57:41 crc kubenswrapper[4771]: I1001 15:57:41.884780 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/account-server/0.log" Oct 01 15:57:41 crc kubenswrapper[4771]: I1001 15:57:41.901250 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/account-replicator/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.009458 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/container-auditor/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.092436 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/container-server/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.170691 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/container-replicator/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.180222 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/container-updater/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.282575 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-auditor/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.374706 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-expirer/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.449724 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-replicator/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.525069 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-server/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.638073 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-updater/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.707281 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/rsync/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.773721 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/swift-recon-cron/0.log" Oct 01 15:57:42 crc kubenswrapper[4771]: I1001 15:57:42.986053 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6_b938863c-4e4f-414a-9b0b-2d2583d9ae0c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:43 crc kubenswrapper[4771]: I1001 15:57:43.109313 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_32741182-3a7c-43a7-b996-1dd78a418dc6/tempest-tests-tempest-tests-runner/0.log" Oct 01 15:57:43 crc kubenswrapper[4771]: I1001 15:57:43.236934 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e0b503da-0417-4b9a-b62e-3f13be34b988/test-operator-logs-container/0.log" Oct 01 15:57:43 crc kubenswrapper[4771]: I1001 15:57:43.399963 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx_9fb926c9-80f3-4d82-9de5-a4f0fc314ef5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 15:57:43 crc kubenswrapper[4771]: I1001 15:57:43.987893 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:57:43 crc kubenswrapper[4771]: E1001 15:57:43.988509 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:57:50 crc kubenswrapper[4771]: I1001 15:57:50.482838 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_107cd834-196a-4454-b70b-cbb3ab3631df/memcached/0.log" Oct 01 15:57:58 crc kubenswrapper[4771]: I1001 15:57:58.985154 4771 scope.go:117] "RemoveContainer" 
containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:57:58 crc kubenswrapper[4771]: E1001 15:57:58.985969 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:58:13 crc kubenswrapper[4771]: I1001 15:58:13.985445 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:58:13 crc kubenswrapper[4771]: E1001 15:58:13.986710 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:58:28 crc kubenswrapper[4771]: I1001 15:58:28.985342 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:58:28 crc kubenswrapper[4771]: E1001 15:58:28.986132 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:58:38 crc kubenswrapper[4771]: I1001 15:58:38.707202 4771 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-vxhj6" Oct 01 15:58:41 crc kubenswrapper[4771]: I1001 15:58:41.985763 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:58:41 crc kubenswrapper[4771]: E1001 15:58:41.986647 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:58:43 crc kubenswrapper[4771]: I1001 15:58:43.971335 4771 generic.go:334] "Generic (PLEG): container finished" podID="d71374b2-c1f0-4d3a-b8b7-783495ecb58e" containerID="90187f2c2543904cf59f6abfb11ad8624253bfd98d8bdaa9f89618cfe5d81bfe" exitCode=0 Oct 01 15:58:43 crc kubenswrapper[4771]: I1001 15:58:43.971428 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/crc-debug-v8ttv" event={"ID":"d71374b2-c1f0-4d3a-b8b7-783495ecb58e","Type":"ContainerDied","Data":"90187f2c2543904cf59f6abfb11ad8624253bfd98d8bdaa9f89618cfe5d81bfe"} Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.121007 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.157486 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rf44r/crc-debug-v8ttv"] Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.163834 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rf44r/crc-debug-v8ttv"] Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.259256 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh6fv\" (UniqueName: \"kubernetes.io/projected/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-kube-api-access-vh6fv\") pod \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\" (UID: \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\") " Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.259449 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-host\") pod \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\" (UID: \"d71374b2-c1f0-4d3a-b8b7-783495ecb58e\") " Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.259575 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-host" (OuterVolumeSpecName: "host") pod "d71374b2-c1f0-4d3a-b8b7-783495ecb58e" (UID: "d71374b2-c1f0-4d3a-b8b7-783495ecb58e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.260464 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-host\") on node \"crc\" DevicePath \"\"" Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.264583 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-kube-api-access-vh6fv" (OuterVolumeSpecName: "kube-api-access-vh6fv") pod "d71374b2-c1f0-4d3a-b8b7-783495ecb58e" (UID: "d71374b2-c1f0-4d3a-b8b7-783495ecb58e"). InnerVolumeSpecName "kube-api-access-vh6fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:58:45 crc kubenswrapper[4771]: I1001 15:58:45.361976 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh6fv\" (UniqueName: \"kubernetes.io/projected/d71374b2-c1f0-4d3a-b8b7-783495ecb58e-kube-api-access-vh6fv\") on node \"crc\" DevicePath \"\"" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.000628 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-v8ttv" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.005050 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d71374b2-c1f0-4d3a-b8b7-783495ecb58e" path="/var/lib/kubelet/pods/d71374b2-c1f0-4d3a-b8b7-783495ecb58e/volumes" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.006391 4771 scope.go:117] "RemoveContainer" containerID="90187f2c2543904cf59f6abfb11ad8624253bfd98d8bdaa9f89618cfe5d81bfe" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.377884 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rf44r/crc-debug-qb4f6"] Oct 01 15:58:46 crc kubenswrapper[4771]: E1001 15:58:46.378477 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d71374b2-c1f0-4d3a-b8b7-783495ecb58e" containerName="container-00" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.378498 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="d71374b2-c1f0-4d3a-b8b7-783495ecb58e" containerName="container-00" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.378943 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="d71374b2-c1f0-4d3a-b8b7-783495ecb58e" containerName="container-00" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.379947 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.382295 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rf44r"/"default-dockercfg-r4t4p" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.483443 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e217d90d-b2b4-4749-b490-d90300951483-host\") pod \"crc-debug-qb4f6\" (UID: \"e217d90d-b2b4-4749-b490-d90300951483\") " pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.483599 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t7qq\" (UniqueName: \"kubernetes.io/projected/e217d90d-b2b4-4749-b490-d90300951483-kube-api-access-6t7qq\") pod \"crc-debug-qb4f6\" (UID: \"e217d90d-b2b4-4749-b490-d90300951483\") " pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.585618 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e217d90d-b2b4-4749-b490-d90300951483-host\") pod \"crc-debug-qb4f6\" (UID: \"e217d90d-b2b4-4749-b490-d90300951483\") " pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.585814 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e217d90d-b2b4-4749-b490-d90300951483-host\") pod \"crc-debug-qb4f6\" (UID: \"e217d90d-b2b4-4749-b490-d90300951483\") " pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.586098 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t7qq\" (UniqueName: 
\"kubernetes.io/projected/e217d90d-b2b4-4749-b490-d90300951483-kube-api-access-6t7qq\") pod \"crc-debug-qb4f6\" (UID: \"e217d90d-b2b4-4749-b490-d90300951483\") " pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.609866 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t7qq\" (UniqueName: \"kubernetes.io/projected/e217d90d-b2b4-4749-b490-d90300951483-kube-api-access-6t7qq\") pod \"crc-debug-qb4f6\" (UID: \"e217d90d-b2b4-4749-b490-d90300951483\") " pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:46 crc kubenswrapper[4771]: I1001 15:58:46.715661 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:47 crc kubenswrapper[4771]: I1001 15:58:47.015840 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/crc-debug-qb4f6" event={"ID":"e217d90d-b2b4-4749-b490-d90300951483","Type":"ContainerStarted","Data":"3c143ae9d050bbdb8a0ff163c919bc5d601b9bb6ac1b204a0519c69bb48e8d31"} Oct 01 15:58:48 crc kubenswrapper[4771]: I1001 15:58:48.030759 4771 generic.go:334] "Generic (PLEG): container finished" podID="e217d90d-b2b4-4749-b490-d90300951483" containerID="133293e0c47bc996b01edc19003009f0be84799b420eadd057f352fa78f06791" exitCode=0 Oct 01 15:58:48 crc kubenswrapper[4771]: I1001 15:58:48.030812 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/crc-debug-qb4f6" event={"ID":"e217d90d-b2b4-4749-b490-d90300951483","Type":"ContainerDied","Data":"133293e0c47bc996b01edc19003009f0be84799b420eadd057f352fa78f06791"} Oct 01 15:58:49 crc kubenswrapper[4771]: I1001 15:58:49.375372 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:49 crc kubenswrapper[4771]: I1001 15:58:49.438612 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6t7qq\" (UniqueName: \"kubernetes.io/projected/e217d90d-b2b4-4749-b490-d90300951483-kube-api-access-6t7qq\") pod \"e217d90d-b2b4-4749-b490-d90300951483\" (UID: \"e217d90d-b2b4-4749-b490-d90300951483\") " Oct 01 15:58:49 crc kubenswrapper[4771]: I1001 15:58:49.438760 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e217d90d-b2b4-4749-b490-d90300951483-host\") pod \"e217d90d-b2b4-4749-b490-d90300951483\" (UID: \"e217d90d-b2b4-4749-b490-d90300951483\") " Oct 01 15:58:49 crc kubenswrapper[4771]: I1001 15:58:49.439219 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e217d90d-b2b4-4749-b490-d90300951483-host" (OuterVolumeSpecName: "host") pod "e217d90d-b2b4-4749-b490-d90300951483" (UID: "e217d90d-b2b4-4749-b490-d90300951483"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:58:49 crc kubenswrapper[4771]: I1001 15:58:49.443977 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e217d90d-b2b4-4749-b490-d90300951483-kube-api-access-6t7qq" (OuterVolumeSpecName: "kube-api-access-6t7qq") pod "e217d90d-b2b4-4749-b490-d90300951483" (UID: "e217d90d-b2b4-4749-b490-d90300951483"). InnerVolumeSpecName "kube-api-access-6t7qq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:58:49 crc kubenswrapper[4771]: I1001 15:58:49.540541 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6t7qq\" (UniqueName: \"kubernetes.io/projected/e217d90d-b2b4-4749-b490-d90300951483-kube-api-access-6t7qq\") on node \"crc\" DevicePath \"\"" Oct 01 15:58:49 crc kubenswrapper[4771]: I1001 15:58:49.540572 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e217d90d-b2b4-4749-b490-d90300951483-host\") on node \"crc\" DevicePath \"\"" Oct 01 15:58:50 crc kubenswrapper[4771]: I1001 15:58:50.047415 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/crc-debug-qb4f6" event={"ID":"e217d90d-b2b4-4749-b490-d90300951483","Type":"ContainerDied","Data":"3c143ae9d050bbdb8a0ff163c919bc5d601b9bb6ac1b204a0519c69bb48e8d31"} Oct 01 15:58:50 crc kubenswrapper[4771]: I1001 15:58:50.047678 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c143ae9d050bbdb8a0ff163c919bc5d601b9bb6ac1b204a0519c69bb48e8d31" Oct 01 15:58:50 crc kubenswrapper[4771]: I1001 15:58:50.047721 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-qb4f6" Oct 01 15:58:54 crc kubenswrapper[4771]: I1001 15:58:54.577336 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rf44r/crc-debug-qb4f6"] Oct 01 15:58:54 crc kubenswrapper[4771]: I1001 15:58:54.584416 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rf44r/crc-debug-qb4f6"] Oct 01 15:58:55 crc kubenswrapper[4771]: I1001 15:58:55.792851 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rf44r/crc-debug-z8ffr"] Oct 01 15:58:55 crc kubenswrapper[4771]: E1001 15:58:55.793272 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e217d90d-b2b4-4749-b490-d90300951483" containerName="container-00" Oct 01 15:58:55 crc kubenswrapper[4771]: I1001 15:58:55.793288 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="e217d90d-b2b4-4749-b490-d90300951483" containerName="container-00" Oct 01 15:58:55 crc kubenswrapper[4771]: I1001 15:58:55.793534 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="e217d90d-b2b4-4749-b490-d90300951483" containerName="container-00" Oct 01 15:58:55 crc kubenswrapper[4771]: I1001 15:58:55.794180 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:55 crc kubenswrapper[4771]: I1001 15:58:55.797397 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rf44r"/"default-dockercfg-r4t4p" Oct 01 15:58:55 crc kubenswrapper[4771]: I1001 15:58:55.934565 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn4qf\" (UniqueName: \"kubernetes.io/projected/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-kube-api-access-qn4qf\") pod \"crc-debug-z8ffr\" (UID: \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\") " pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:55 crc kubenswrapper[4771]: I1001 15:58:55.935063 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-host\") pod \"crc-debug-z8ffr\" (UID: \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\") " pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:55 crc kubenswrapper[4771]: I1001 15:58:55.997328 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:58:55 crc kubenswrapper[4771]: E1001 15:58:55.997703 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:58:56 crc kubenswrapper[4771]: I1001 15:58:56.002506 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e217d90d-b2b4-4749-b490-d90300951483" path="/var/lib/kubelet/pods/e217d90d-b2b4-4749-b490-d90300951483/volumes" Oct 01 15:58:56 crc 
kubenswrapper[4771]: I1001 15:58:56.036216 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn4qf\" (UniqueName: \"kubernetes.io/projected/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-kube-api-access-qn4qf\") pod \"crc-debug-z8ffr\" (UID: \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\") " pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:56 crc kubenswrapper[4771]: I1001 15:58:56.036278 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-host\") pod \"crc-debug-z8ffr\" (UID: \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\") " pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:56 crc kubenswrapper[4771]: I1001 15:58:56.036456 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-host\") pod \"crc-debug-z8ffr\" (UID: \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\") " pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:56 crc kubenswrapper[4771]: I1001 15:58:56.069505 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn4qf\" (UniqueName: \"kubernetes.io/projected/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-kube-api-access-qn4qf\") pod \"crc-debug-z8ffr\" (UID: \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\") " pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:56 crc kubenswrapper[4771]: I1001 15:58:56.131921 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rf44r"/"default-dockercfg-r4t4p" Oct 01 15:58:56 crc kubenswrapper[4771]: I1001 15:58:56.140782 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:57 crc kubenswrapper[4771]: I1001 15:58:57.113226 4771 generic.go:334] "Generic (PLEG): container finished" podID="ccb4514e-cd01-46d8-b5fc-2dd03fa08369" containerID="fa8e364e4ac01141ba744e7da4f2ebc50d5cc9533156baf14e9e8f51a0b81bd0" exitCode=0 Oct 01 15:58:57 crc kubenswrapper[4771]: I1001 15:58:57.113322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/crc-debug-z8ffr" event={"ID":"ccb4514e-cd01-46d8-b5fc-2dd03fa08369","Type":"ContainerDied","Data":"fa8e364e4ac01141ba744e7da4f2ebc50d5cc9533156baf14e9e8f51a0b81bd0"} Oct 01 15:58:57 crc kubenswrapper[4771]: I1001 15:58:57.113834 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/crc-debug-z8ffr" event={"ID":"ccb4514e-cd01-46d8-b5fc-2dd03fa08369","Type":"ContainerStarted","Data":"f97ebcc02184c37b723689c328da8b657c0fea8f452485fec8df8f41c2365fe3"} Oct 01 15:58:57 crc kubenswrapper[4771]: I1001 15:58:57.163008 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rf44r/crc-debug-z8ffr"] Oct 01 15:58:57 crc kubenswrapper[4771]: I1001 15:58:57.178048 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rf44r/crc-debug-z8ffr"] Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.229864 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.375123 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn4qf\" (UniqueName: \"kubernetes.io/projected/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-kube-api-access-qn4qf\") pod \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\" (UID: \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\") " Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.375256 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-host\") pod \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\" (UID: \"ccb4514e-cd01-46d8-b5fc-2dd03fa08369\") " Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.375309 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-host" (OuterVolumeSpecName: "host") pod "ccb4514e-cd01-46d8-b5fc-2dd03fa08369" (UID: "ccb4514e-cd01-46d8-b5fc-2dd03fa08369"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.375955 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-host\") on node \"crc\" DevicePath \"\"" Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.380931 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-kube-api-access-qn4qf" (OuterVolumeSpecName: "kube-api-access-qn4qf") pod "ccb4514e-cd01-46d8-b5fc-2dd03fa08369" (UID: "ccb4514e-cd01-46d8-b5fc-2dd03fa08369"). InnerVolumeSpecName "kube-api-access-qn4qf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.477711 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn4qf\" (UniqueName: \"kubernetes.io/projected/ccb4514e-cd01-46d8-b5fc-2dd03fa08369-kube-api-access-qn4qf\") on node \"crc\" DevicePath \"\"" Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.666569 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-6vp9n_41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6/kube-rbac-proxy/0.log" Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.772635 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-6vp9n_41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6/manager/0.log" Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.885022 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-gbllm_31b30f6f-3f4e-4c6b-9517-0eb866b2c68c/kube-rbac-proxy/0.log" Oct 01 15:58:58 crc kubenswrapper[4771]: I1001 15:58:58.928191 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-gbllm_31b30f6f-3f4e-4c6b-9517-0eb866b2c68c/manager/0.log" Oct 01 15:58:59 crc kubenswrapper[4771]: I1001 15:58:59.072305 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-7tpgh_46077b26-3930-4245-86b4-2d836a165664/manager/0.log" Oct 01 15:58:59 crc kubenswrapper[4771]: I1001 15:58:59.088483 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-7tpgh_46077b26-3930-4245-86b4-2d836a165664/kube-rbac-proxy/0.log" Oct 01 15:58:59 crc kubenswrapper[4771]: I1001 15:58:59.152326 4771 scope.go:117] "RemoveContainer" 
containerID="fa8e364e4ac01141ba744e7da4f2ebc50d5cc9533156baf14e9e8f51a0b81bd0" Oct 01 15:58:59 crc kubenswrapper[4771]: I1001 15:58:59.152362 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rf44r/crc-debug-z8ffr" Oct 01 15:58:59 crc kubenswrapper[4771]: I1001 15:58:59.176374 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/util/0.log" Oct 01 15:58:59 crc kubenswrapper[4771]: I1001 15:58:59.949747 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/pull/0.log" Oct 01 15:58:59 crc kubenswrapper[4771]: I1001 15:58:59.971372 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/util/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.000681 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb4514e-cd01-46d8-b5fc-2dd03fa08369" path="/var/lib/kubelet/pods/ccb4514e-cd01-46d8-b5fc-2dd03fa08369/volumes" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.033044 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/pull/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.143721 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/pull/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.171393 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/util/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.192429 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/extract/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.321071 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-99nfn_60253a13-4845-4234-81f4-329e6f35a86e/kube-rbac-proxy/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.400445 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-99nfn_60253a13-4845-4234-81f4-329e6f35a86e/manager/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.410086 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-55pmn_43a1a358-9eba-46eb-90c5-a34e0fad09d6/kube-rbac-proxy/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.488193 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-55pmn_43a1a358-9eba-46eb-90c5-a34e0fad09d6/manager/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.592730 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-mwl7z_681d5bbd-36b0-497f-9d27-f8cc7473399a/kube-rbac-proxy/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.644421 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-mwl7z_681d5bbd-36b0-497f-9d27-f8cc7473399a/manager/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 
15:59:00.759264 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-xkmp2_f7c18d5d-6ebb-4c31-a348-6ae7feebfafc/kube-rbac-proxy/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.874955 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-zcdfw_fcb3eff6-0c4e-4046-a829-fab3a5942d21/kube-rbac-proxy/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.941426 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-zcdfw_fcb3eff6-0c4e-4046-a829-fab3a5942d21/manager/0.log" Oct 01 15:59:00 crc kubenswrapper[4771]: I1001 15:59:00.955028 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-xkmp2_f7c18d5d-6ebb-4c31-a348-6ae7feebfafc/manager/0.log" Oct 01 15:59:01 crc kubenswrapper[4771]: I1001 15:59:01.067652 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-wps64_0c5bf036-417b-4f93-94a0-7c8ddc9028d7/kube-rbac-proxy/0.log" Oct 01 15:59:01 crc kubenswrapper[4771]: I1001 15:59:01.180707 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-wps64_0c5bf036-417b-4f93-94a0-7c8ddc9028d7/manager/0.log" Oct 01 15:59:01 crc kubenswrapper[4771]: I1001 15:59:01.656361 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-nv495_4c10ae08-be13-4725-b9be-c55ce015f33e/kube-rbac-proxy/0.log" Oct 01 15:59:01 crc kubenswrapper[4771]: I1001 15:59:01.765013 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-nv495_4c10ae08-be13-4725-b9be-c55ce015f33e/manager/0.log" Oct 01 
15:59:01 crc kubenswrapper[4771]: I1001 15:59:01.923262 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-nv6q5_c8125ce6-7c9a-45d3-b820-698fd30d3471/kube-rbac-proxy/0.log" Oct 01 15:59:01 crc kubenswrapper[4771]: I1001 15:59:01.950607 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-5jmlk_a1c30bd9-dd78-4d92-9423-16597bf7d758/kube-rbac-proxy/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.069467 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-5jmlk_a1c30bd9-dd78-4d92-9423-16597bf7d758/manager/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.114060 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-l85hk_863ec596-646c-41a0-b3e4-e33ad84c79aa/kube-rbac-proxy/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.136775 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-nv6q5_c8125ce6-7c9a-45d3-b820-698fd30d3471/manager/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.252696 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-l85hk_863ec596-646c-41a0-b3e4-e33ad84c79aa/manager/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.303284 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-fx47m_417e6338-ae16-4903-8381-5bb1c3a92c75/kube-rbac-proxy/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.328798 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-fx47m_417e6338-ae16-4903-8381-5bb1c3a92c75/manager/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.455003 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8c62g57_c87cb84f-c539-4562-8492-b1106b6181f1/manager/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.456008 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8c62g57_c87cb84f-c539-4562-8492-b1106b6181f1/kube-rbac-proxy/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.504091 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-577574bf4d-p8zrk_31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95/kube-rbac-proxy/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.677340 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-9fb4b654-b6wcr_6285b448-a922-4015-96e8-3af02ca8a82d/kube-rbac-proxy/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.889997 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hl4lz_4a21d3b6-e1e3-493b-baf1-fcbb055fb859/registry-server/0.log" Oct 01 15:59:02 crc kubenswrapper[4771]: I1001 15:59:02.894700 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-9fb4b654-b6wcr_6285b448-a922-4015-96e8-3af02ca8a82d/operator/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.008885 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-qnlxq_b90ad79b-e447-4a96-82a1-4ae8cb5b9959/kube-rbac-proxy/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 
15:59:03.143705 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-qnlxq_b90ad79b-e447-4a96-82a1-4ae8cb5b9959/manager/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.207874 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-bkpfg_9c1c4098-4bbf-4d54-a09d-44b29ef352c3/kube-rbac-proxy/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.278131 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-bkpfg_9c1c4098-4bbf-4d54-a09d-44b29ef352c3/manager/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.444443 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-7clp8_b5efb91f-7e66-488b-ab6f-e52dbf63bc3c/operator/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.517389 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-wljgn_193791c4-4d63-4f50-a743-439b664c16b7/kube-rbac-proxy/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.642681 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-wljgn_193791c4-4d63-4f50-a743-439b664c16b7/manager/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.706162 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-577574bf4d-p8zrk_31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95/manager/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.745639 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-c495dbccb-25dzd_9abd1d50-adca-4d9a-8c33-89c3242174a5/kube-rbac-proxy/0.log" Oct 01 
15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.756152 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-c495dbccb-25dzd_9abd1d50-adca-4d9a-8c33-89c3242174a5/manager/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.893699 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-vvkwz_e6d65172-53ec-4aae-a508-b955072cdd2a/manager/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.896898 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-vvkwz_e6d65172-53ec-4aae-a508-b955072cdd2a/kube-rbac-proxy/0.log" Oct 01 15:59:03 crc kubenswrapper[4771]: I1001 15:59:03.940854 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-crn74_a0517b85-5f5f-4d87-92f8-901564af068c/kube-rbac-proxy/0.log" Oct 01 15:59:04 crc kubenswrapper[4771]: I1001 15:59:04.040750 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-crn74_a0517b85-5f5f-4d87-92f8-901564af068c/manager/0.log" Oct 01 15:59:08 crc kubenswrapper[4771]: I1001 15:59:08.985323 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:59:08 crc kubenswrapper[4771]: E1001 15:59:08.986134 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:59:19 crc kubenswrapper[4771]: I1001 15:59:19.130238 
4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8gml8_6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4/control-plane-machine-set-operator/0.log" Oct 01 15:59:19 crc kubenswrapper[4771]: I1001 15:59:19.307899 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qlkrl_f160df7c-e97b-4c5a-badf-08379f8e27bf/machine-api-operator/0.log" Oct 01 15:59:19 crc kubenswrapper[4771]: I1001 15:59:19.317775 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qlkrl_f160df7c-e97b-4c5a-badf-08379f8e27bf/kube-rbac-proxy/0.log" Oct 01 15:59:22 crc kubenswrapper[4771]: I1001 15:59:22.986003 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:59:22 crc kubenswrapper[4771]: E1001 15:59:22.988167 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:59:31 crc kubenswrapper[4771]: I1001 15:59:31.438007 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-n6g7s_96d1876b-3e09-4899-8b04-a49c88ebf65d/cert-manager-controller/0.log" Oct 01 15:59:31 crc kubenswrapper[4771]: I1001 15:59:31.588171 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wjfnq_153105d4-1f8e-43f2-bcea-0f3a36598eb0/cert-manager-cainjector/0.log" Oct 01 15:59:31 crc kubenswrapper[4771]: I1001 15:59:31.636613 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9zpvt_fdd5e5a8-3303-4cee-ad76-d47d1a0da067/cert-manager-webhook/0.log" Oct 01 15:59:36 crc kubenswrapper[4771]: I1001 15:59:36.985392 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:59:36 crc kubenswrapper[4771]: E1001 15:59:36.986089 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:59:43 crc kubenswrapper[4771]: I1001 15:59:43.508117 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-nghbt_2d196fd3-9a93-4f81-b0ad-fefca77240a5/nmstate-console-plugin/0.log" Oct 01 15:59:43 crc kubenswrapper[4771]: I1001 15:59:43.597335 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vqfvd_a8a82139-4b56-419e-a4e4-143e3246ec96/nmstate-handler/0.log" Oct 01 15:59:43 crc kubenswrapper[4771]: I1001 15:59:43.666316 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-wpddr_5082e8a7-dbba-4c99-9c6a-f35f64310963/kube-rbac-proxy/0.log" Oct 01 15:59:43 crc kubenswrapper[4771]: I1001 15:59:43.711126 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-wpddr_5082e8a7-dbba-4c99-9c6a-f35f64310963/nmstate-metrics/0.log" Oct 01 15:59:44 crc kubenswrapper[4771]: I1001 15:59:44.069533 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-9twbv_88f55c4c-a1d9-4751-9776-562464717201/nmstate-operator/0.log" Oct 01 15:59:44 crc kubenswrapper[4771]: I1001 15:59:44.075048 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-rx6d2_e9d87ba9-d0d9-4647-a69b-4a114140b6be/nmstate-webhook/0.log" Oct 01 15:59:47 crc kubenswrapper[4771]: I1001 15:59:47.989065 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:59:47 crc kubenswrapper[4771]: E1001 15:59:47.989561 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:59:58 crc kubenswrapper[4771]: I1001 15:59:58.948338 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ffb8z_e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4/kube-rbac-proxy/0.log" Oct 01 15:59:58 crc kubenswrapper[4771]: I1001 15:59:58.985326 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 15:59:58 crc kubenswrapper[4771]: E1001 15:59:58.985756 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 
15:59:59.021067 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ffb8z_e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4/controller/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.173344 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-frr-files/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.343550 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-metrics/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.362807 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-reloader/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.372971 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-frr-files/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.393708 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-reloader/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.521606 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-reloader/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.545250 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-frr-files/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.572881 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-metrics/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.610200 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-metrics/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.812955 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-reloader/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.826911 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/controller/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.835487 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-metrics/0.log" Oct 01 15:59:59 crc kubenswrapper[4771]: I1001 15:59:59.859747 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-frr-files/0.log" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.027233 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/frr-metrics/0.log" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.048273 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/kube-rbac-proxy/0.log" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.065145 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/kube-rbac-proxy-frr/0.log" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.145444 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d"] Oct 01 16:00:00 crc kubenswrapper[4771]: E1001 16:00:00.145831 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb4514e-cd01-46d8-b5fc-2dd03fa08369" containerName="container-00" Oct 01 
16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.145849 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb4514e-cd01-46d8-b5fc-2dd03fa08369" containerName="container-00" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.146012 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb4514e-cd01-46d8-b5fc-2dd03fa08369" containerName="container-00" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.146560 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.149602 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.151885 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.162414 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d"] Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.254360 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-config-volume\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.254521 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-secret-volume\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.254558 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdxbg\" (UniqueName: \"kubernetes.io/projected/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-kube-api-access-bdxbg\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.258906 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/reloader/0.log" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.327038 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-4lvs5_d0200ff4-2245-408d-bf5f-28479e049c57/frr-k8s-webhook-server/0.log" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.355723 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-secret-volume\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.356618 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdxbg\" (UniqueName: \"kubernetes.io/projected/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-kube-api-access-bdxbg\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.356864 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-config-volume\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.357765 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-config-volume\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.366715 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-secret-volume\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.375131 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdxbg\" (UniqueName: \"kubernetes.io/projected/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-kube-api-access-bdxbg\") pod \"collect-profiles-29322240-cbk5d\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.484588 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.685130 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-554dcf567c-bnmzr_35c6bc93-608d-4534-9ccd-493ea57f189d/manager/0.log" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.835406 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-bd9669845-gnppv_b09aca49-c227-4561-9f87-df661ec6d85c/webhook-server/0.log" Oct 01 16:00:00 crc kubenswrapper[4771]: I1001 16:00:00.963050 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s5glz_cf712988-695e-4dae-a121-ce52bf39689e/kube-rbac-proxy/0.log" Oct 01 16:00:01 crc kubenswrapper[4771]: I1001 16:00:01.004210 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d"] Oct 01 16:00:01 crc kubenswrapper[4771]: I1001 16:00:01.349400 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/frr/0.log" Oct 01 16:00:01 crc kubenswrapper[4771]: I1001 16:00:01.461474 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s5glz_cf712988-695e-4dae-a121-ce52bf39689e/speaker/0.log" Oct 01 16:00:01 crc kubenswrapper[4771]: I1001 16:00:01.750975 4771 generic.go:334] "Generic (PLEG): container finished" podID="52c1a9fe-ba9f-4d54-ba72-45387aacaf7f" containerID="ca74f97b2a07522357769125a5882cb6475d094e2da6001fddf67293dde0c231" exitCode=0 Oct 01 16:00:01 crc kubenswrapper[4771]: I1001 16:00:01.751022 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" 
event={"ID":"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f","Type":"ContainerDied","Data":"ca74f97b2a07522357769125a5882cb6475d094e2da6001fddf67293dde0c231"} Oct 01 16:00:01 crc kubenswrapper[4771]: I1001 16:00:01.751075 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" event={"ID":"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f","Type":"ContainerStarted","Data":"4dc406a4f06a7ba2cf58dfe6b117dd5b8fdd11d521298c974d4bf7a2b4a05ad5"} Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.107883 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.210845 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-secret-volume\") pod \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.210979 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdxbg\" (UniqueName: \"kubernetes.io/projected/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-kube-api-access-bdxbg\") pod \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.211012 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-config-volume\") pod \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\" (UID: \"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f\") " Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.211846 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "52c1a9fe-ba9f-4d54-ba72-45387aacaf7f" (UID: "52c1a9fe-ba9f-4d54-ba72-45387aacaf7f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.216226 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "52c1a9fe-ba9f-4d54-ba72-45387aacaf7f" (UID: "52c1a9fe-ba9f-4d54-ba72-45387aacaf7f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.217041 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-kube-api-access-bdxbg" (OuterVolumeSpecName: "kube-api-access-bdxbg") pod "52c1a9fe-ba9f-4d54-ba72-45387aacaf7f" (UID: "52c1a9fe-ba9f-4d54-ba72-45387aacaf7f"). InnerVolumeSpecName "kube-api-access-bdxbg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.313154 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdxbg\" (UniqueName: \"kubernetes.io/projected/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-kube-api-access-bdxbg\") on node \"crc\" DevicePath \"\"" Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.313198 4771 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-config-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.313212 4771 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52c1a9fe-ba9f-4d54-ba72-45387aacaf7f-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.767627 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" event={"ID":"52c1a9fe-ba9f-4d54-ba72-45387aacaf7f","Type":"ContainerDied","Data":"4dc406a4f06a7ba2cf58dfe6b117dd5b8fdd11d521298c974d4bf7a2b4a05ad5"} Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.767665 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dc406a4f06a7ba2cf58dfe6b117dd5b8fdd11d521298c974d4bf7a2b4a05ad5" Oct 01 16:00:03 crc kubenswrapper[4771]: I1001 16:00:03.767666 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29322240-cbk5d" Oct 01 16:00:04 crc kubenswrapper[4771]: I1001 16:00:04.178619 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv"] Oct 01 16:00:04 crc kubenswrapper[4771]: I1001 16:00:04.186421 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29322195-9gkfv"] Oct 01 16:00:06 crc kubenswrapper[4771]: I1001 16:00:06.001534 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d1d22e-d17a-40b5-b968-c4f1277ae9cb" path="/var/lib/kubelet/pods/49d1d22e-d17a-40b5-b968-c4f1277ae9cb/volumes" Oct 01 16:00:13 crc kubenswrapper[4771]: I1001 16:00:13.984951 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 16:00:13 crc kubenswrapper[4771]: E1001 16:00:13.985586 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:00:14 crc kubenswrapper[4771]: I1001 16:00:14.349891 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/util/0.log" Oct 01 16:00:14 crc kubenswrapper[4771]: I1001 16:00:14.458870 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/util/0.log" Oct 01 16:00:14 crc kubenswrapper[4771]: I1001 16:00:14.470531 
4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/pull/0.log" Oct 01 16:00:14 crc kubenswrapper[4771]: I1001 16:00:14.542104 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/pull/0.log" Oct 01 16:00:14 crc kubenswrapper[4771]: I1001 16:00:14.723915 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/extract/0.log" Oct 01 16:00:14 crc kubenswrapper[4771]: I1001 16:00:14.765540 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/util/0.log" Oct 01 16:00:14 crc kubenswrapper[4771]: I1001 16:00:14.779127 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/pull/0.log" Oct 01 16:00:14 crc kubenswrapper[4771]: I1001 16:00:14.910681 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-utilities/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.096910 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-content/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.135316 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-content/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: 
I1001 16:00:15.152524 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-utilities/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.323491 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-content/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.327142 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-utilities/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.561556 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-utilities/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.750498 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/registry-server/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.794501 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-content/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.813890 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-content/0.log" Oct 01 16:00:15 crc kubenswrapper[4771]: I1001 16:00:15.828566 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-utilities/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.042964 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-utilities/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.055287 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-content/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.323004 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/util/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.542398 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/pull/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.552988 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/registry-server/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.569296 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/pull/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.569994 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/util/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.737072 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/pull/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.748224 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/extract/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.794368 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/util/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.981677 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4bz2c_bf346cda-7a16-42a2-b731-d8834b7a1380/marketplace-operator/0.log" Oct 01 16:00:16 crc kubenswrapper[4771]: I1001 16:00:16.996492 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-utilities/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.176985 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-utilities/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.227113 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-content/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.261309 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-content/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.418835 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-utilities/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.424538 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-content/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.550950 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/registry-server/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.592224 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-utilities/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.823131 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-content/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.832280 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-content/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.856227 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-utilities/0.log" Oct 01 16:00:17 crc kubenswrapper[4771]: I1001 16:00:17.993350 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-utilities/0.log" Oct 01 16:00:18 crc kubenswrapper[4771]: I1001 16:00:18.018462 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-content/0.log" Oct 01 16:00:18 crc kubenswrapper[4771]: I1001 16:00:18.701357 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/registry-server/0.log" Oct 01 
16:00:27 crc kubenswrapper[4771]: I1001 16:00:27.008971 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 16:00:27 crc kubenswrapper[4771]: E1001 16:00:27.009867 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:00:38 crc kubenswrapper[4771]: I1001 16:00:38.985360 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 16:00:38 crc kubenswrapper[4771]: E1001 16:00:38.986266 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:00:51 crc kubenswrapper[4771]: I1001 16:00:51.985142 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 16:00:52 crc kubenswrapper[4771]: I1001 16:00:52.264303 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"19c8aefbbf261f233703d5d75b1ff28f73820b1b39703c62ab135fd2a27acfcb"} Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.152135 4771 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/keystone-cron-29322241-5g5cn"] Oct 01 16:01:00 crc kubenswrapper[4771]: E1001 16:01:00.152966 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c1a9fe-ba9f-4d54-ba72-45387aacaf7f" containerName="collect-profiles" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.152978 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c1a9fe-ba9f-4d54-ba72-45387aacaf7f" containerName="collect-profiles" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.153170 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c1a9fe-ba9f-4d54-ba72-45387aacaf7f" containerName="collect-profiles" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.153720 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.169177 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322241-5g5cn"] Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.236556 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-combined-ca-bundle\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.237004 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8vpw\" (UniqueName: \"kubernetes.io/projected/8373a131-a79c-4385-a8bd-949721e18222-kube-api-access-k8vpw\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.237132 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-fernet-keys\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.237176 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-config-data\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.338799 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8vpw\" (UniqueName: \"kubernetes.io/projected/8373a131-a79c-4385-a8bd-949721e18222-kube-api-access-k8vpw\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.338922 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-fernet-keys\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.338965 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-config-data\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.339124 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-combined-ca-bundle\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.347556 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-fernet-keys\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.348323 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-combined-ca-bundle\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.349262 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-config-data\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.368285 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8vpw\" (UniqueName: \"kubernetes.io/projected/8373a131-a79c-4385-a8bd-949721e18222-kube-api-access-k8vpw\") pod \"keystone-cron-29322241-5g5cn\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.475322 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:00 crc kubenswrapper[4771]: I1001 16:01:00.993309 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29322241-5g5cn"] Oct 01 16:01:01 crc kubenswrapper[4771]: I1001 16:01:01.356358 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322241-5g5cn" event={"ID":"8373a131-a79c-4385-a8bd-949721e18222","Type":"ContainerStarted","Data":"5f527a7fddd91cff6f01705f6ae67048bbb09212c5bb55a35d070ad6fe5acf8c"} Oct 01 16:01:01 crc kubenswrapper[4771]: I1001 16:01:01.357639 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322241-5g5cn" event={"ID":"8373a131-a79c-4385-a8bd-949721e18222","Type":"ContainerStarted","Data":"5d8fb875df2efb51d437dfbde9703d4f6fa990b439a75e0011aef90054e77816"} Oct 01 16:01:01 crc kubenswrapper[4771]: I1001 16:01:01.384445 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29322241-5g5cn" podStartSLOduration=1.384423489 podStartE2EDuration="1.384423489s" podCreationTimestamp="2025-10-01 16:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:01:01.374057375 +0000 UTC m=+3905.993232566" watchObservedRunningTime="2025-10-01 16:01:01.384423489 +0000 UTC m=+3906.003598670" Oct 01 16:01:03 crc kubenswrapper[4771]: I1001 16:01:03.378896 4771 generic.go:334] "Generic (PLEG): container finished" podID="8373a131-a79c-4385-a8bd-949721e18222" containerID="5f527a7fddd91cff6f01705f6ae67048bbb09212c5bb55a35d070ad6fe5acf8c" exitCode=0 Oct 01 16:01:03 crc kubenswrapper[4771]: I1001 16:01:03.379007 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322241-5g5cn" 
event={"ID":"8373a131-a79c-4385-a8bd-949721e18222","Type":"ContainerDied","Data":"5f527a7fddd91cff6f01705f6ae67048bbb09212c5bb55a35d070ad6fe5acf8c"} Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.297201 4771 scope.go:117] "RemoveContainer" containerID="0ab99eb779308fb00453a3d5a5ac906b56271e46379cdb79c78d6210c1984a29" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.792210 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.831074 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8vpw\" (UniqueName: \"kubernetes.io/projected/8373a131-a79c-4385-a8bd-949721e18222-kube-api-access-k8vpw\") pod \"8373a131-a79c-4385-a8bd-949721e18222\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.831253 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-fernet-keys\") pod \"8373a131-a79c-4385-a8bd-949721e18222\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.831366 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-combined-ca-bundle\") pod \"8373a131-a79c-4385-a8bd-949721e18222\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.832376 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-config-data\") pod \"8373a131-a79c-4385-a8bd-949721e18222\" (UID: \"8373a131-a79c-4385-a8bd-949721e18222\") " Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.849525 
4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "8373a131-a79c-4385-a8bd-949721e18222" (UID: "8373a131-a79c-4385-a8bd-949721e18222"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.849875 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8373a131-a79c-4385-a8bd-949721e18222-kube-api-access-k8vpw" (OuterVolumeSpecName: "kube-api-access-k8vpw") pod "8373a131-a79c-4385-a8bd-949721e18222" (UID: "8373a131-a79c-4385-a8bd-949721e18222"). InnerVolumeSpecName "kube-api-access-k8vpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.874122 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8373a131-a79c-4385-a8bd-949721e18222" (UID: "8373a131-a79c-4385-a8bd-949721e18222"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.896387 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-config-data" (OuterVolumeSpecName: "config-data") pod "8373a131-a79c-4385-a8bd-949721e18222" (UID: "8373a131-a79c-4385-a8bd-949721e18222"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.934455 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8vpw\" (UniqueName: \"kubernetes.io/projected/8373a131-a79c-4385-a8bd-949721e18222-kube-api-access-k8vpw\") on node \"crc\" DevicePath \"\"" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.934492 4771 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.934505 4771 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 01 16:01:04 crc kubenswrapper[4771]: I1001 16:01:04.934515 4771 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8373a131-a79c-4385-a8bd-949721e18222-config-data\") on node \"crc\" DevicePath \"\"" Oct 01 16:01:05 crc kubenswrapper[4771]: I1001 16:01:05.410171 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29322241-5g5cn" event={"ID":"8373a131-a79c-4385-a8bd-949721e18222","Type":"ContainerDied","Data":"5d8fb875df2efb51d437dfbde9703d4f6fa990b439a75e0011aef90054e77816"} Oct 01 16:01:05 crc kubenswrapper[4771]: I1001 16:01:05.410210 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d8fb875df2efb51d437dfbde9703d4f6fa990b439a75e0011aef90054e77816" Oct 01 16:01:05 crc kubenswrapper[4771]: I1001 16:01:05.410221 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29322241-5g5cn" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.460003 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6f"] Oct 01 16:02:02 crc kubenswrapper[4771]: E1001 16:02:02.462190 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8373a131-a79c-4385-a8bd-949721e18222" containerName="keystone-cron" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.462225 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="8373a131-a79c-4385-a8bd-949721e18222" containerName="keystone-cron" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.462467 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="8373a131-a79c-4385-a8bd-949721e18222" containerName="keystone-cron" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.464638 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.478990 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6f"] Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.526113 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-catalog-content\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.526512 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slkht\" (UniqueName: \"kubernetes.io/projected/fe46002b-150e-4e55-b1b5-50d658e6a2a4-kube-api-access-slkht\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " 
pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.526718 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-utilities\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.628167 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slkht\" (UniqueName: \"kubernetes.io/projected/fe46002b-150e-4e55-b1b5-50d658e6a2a4-kube-api-access-slkht\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.628266 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-utilities\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.628336 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-catalog-content\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.628888 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-catalog-content\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " 
pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.628916 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-utilities\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.655941 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slkht\" (UniqueName: \"kubernetes.io/projected/fe46002b-150e-4e55-b1b5-50d658e6a2a4-kube-api-access-slkht\") pod \"redhat-marketplace-kkm6f\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:02 crc kubenswrapper[4771]: I1001 16:02:02.799914 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:03 crc kubenswrapper[4771]: I1001 16:02:03.301208 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6f"] Oct 01 16:02:03 crc kubenswrapper[4771]: W1001 16:02:03.780102 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe46002b_150e_4e55_b1b5_50d658e6a2a4.slice/crio-5ba214c273d7761ffdca931bca37e53f43f6b0861449bb8465151fd777dca146 WatchSource:0}: Error finding container 5ba214c273d7761ffdca931bca37e53f43f6b0861449bb8465151fd777dca146: Status 404 returned error can't find the container with id 5ba214c273d7761ffdca931bca37e53f43f6b0861449bb8465151fd777dca146 Oct 01 16:02:04 crc kubenswrapper[4771]: I1001 16:02:04.050949 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6f" 
event={"ID":"fe46002b-150e-4e55-b1b5-50d658e6a2a4","Type":"ContainerStarted","Data":"267e3b96490422a4f114804b74a91c0b475b3c54ae7b54bc305f8b506ea6dc75"} Oct 01 16:02:04 crc kubenswrapper[4771]: I1001 16:02:04.051365 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6f" event={"ID":"fe46002b-150e-4e55-b1b5-50d658e6a2a4","Type":"ContainerStarted","Data":"5ba214c273d7761ffdca931bca37e53f43f6b0861449bb8465151fd777dca146"} Oct 01 16:02:05 crc kubenswrapper[4771]: I1001 16:02:05.062811 4771 generic.go:334] "Generic (PLEG): container finished" podID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerID="267e3b96490422a4f114804b74a91c0b475b3c54ae7b54bc305f8b506ea6dc75" exitCode=0 Oct 01 16:02:05 crc kubenswrapper[4771]: I1001 16:02:05.063011 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6f" event={"ID":"fe46002b-150e-4e55-b1b5-50d658e6a2a4","Type":"ContainerDied","Data":"267e3b96490422a4f114804b74a91c0b475b3c54ae7b54bc305f8b506ea6dc75"} Oct 01 16:02:05 crc kubenswrapper[4771]: I1001 16:02:05.068680 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:02:07 crc kubenswrapper[4771]: I1001 16:02:07.083885 4771 generic.go:334] "Generic (PLEG): container finished" podID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerID="f047eef171c386a3e4f10d9e29496a0547938fa486738a6f52e11764dcde0a0a" exitCode=0 Oct 01 16:02:07 crc kubenswrapper[4771]: I1001 16:02:07.083981 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6f" event={"ID":"fe46002b-150e-4e55-b1b5-50d658e6a2a4","Type":"ContainerDied","Data":"f047eef171c386a3e4f10d9e29496a0547938fa486738a6f52e11764dcde0a0a"} Oct 01 16:02:08 crc kubenswrapper[4771]: I1001 16:02:08.100897 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6f" 
event={"ID":"fe46002b-150e-4e55-b1b5-50d658e6a2a4","Type":"ContainerStarted","Data":"e97692021fe42f3d283229f4f257b89d8af7dc2cb83a1845bedfe8a160366360"} Oct 01 16:02:08 crc kubenswrapper[4771]: I1001 16:02:08.120566 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kkm6f" podStartSLOduration=3.640398082 podStartE2EDuration="6.120546045s" podCreationTimestamp="2025-10-01 16:02:02 +0000 UTC" firstStartedPulling="2025-10-01 16:02:05.06837308 +0000 UTC m=+3969.687548251" lastFinishedPulling="2025-10-01 16:02:07.548521043 +0000 UTC m=+3972.167696214" observedRunningTime="2025-10-01 16:02:08.119284644 +0000 UTC m=+3972.738459815" watchObservedRunningTime="2025-10-01 16:02:08.120546045 +0000 UTC m=+3972.739721216" Oct 01 16:02:12 crc kubenswrapper[4771]: I1001 16:02:12.800160 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:12 crc kubenswrapper[4771]: I1001 16:02:12.800720 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:12 crc kubenswrapper[4771]: I1001 16:02:12.864453 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:13 crc kubenswrapper[4771]: I1001 16:02:13.193445 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:13 crc kubenswrapper[4771]: I1001 16:02:13.242472 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6f"] Oct 01 16:02:15 crc kubenswrapper[4771]: I1001 16:02:15.165533 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kkm6f" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerName="registry-server" 
containerID="cri-o://e97692021fe42f3d283229f4f257b89d8af7dc2cb83a1845bedfe8a160366360" gracePeriod=2 Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.188141 4771 generic.go:334] "Generic (PLEG): container finished" podID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerID="e97692021fe42f3d283229f4f257b89d8af7dc2cb83a1845bedfe8a160366360" exitCode=0 Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.188367 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6f" event={"ID":"fe46002b-150e-4e55-b1b5-50d658e6a2a4","Type":"ContainerDied","Data":"e97692021fe42f3d283229f4f257b89d8af7dc2cb83a1845bedfe8a160366360"} Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.188704 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kkm6f" event={"ID":"fe46002b-150e-4e55-b1b5-50d658e6a2a4","Type":"ContainerDied","Data":"5ba214c273d7761ffdca931bca37e53f43f6b0861449bb8465151fd777dca146"} Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.188725 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5ba214c273d7761ffdca931bca37e53f43f6b0861449bb8465151fd777dca146" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.190871 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.196390 4771 generic.go:334] "Generic (PLEG): container finished" podID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerID="201b1073dbb9a79ad0370e3e4ee1d23658754299a0614d3fd78e16a0863e704c" exitCode=0 Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.196559 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rf44r/must-gather-2mzcl" event={"ID":"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9","Type":"ContainerDied","Data":"201b1073dbb9a79ad0370e3e4ee1d23658754299a0614d3fd78e16a0863e704c"} Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.199275 4771 scope.go:117] "RemoveContainer" containerID="201b1073dbb9a79ad0370e3e4ee1d23658754299a0614d3fd78e16a0863e704c" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.336809 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-catalog-content\") pod \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.336880 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slkht\" (UniqueName: \"kubernetes.io/projected/fe46002b-150e-4e55-b1b5-50d658e6a2a4-kube-api-access-slkht\") pod \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.337076 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-utilities\") pod \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\" (UID: \"fe46002b-150e-4e55-b1b5-50d658e6a2a4\") " Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.338460 4771 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-utilities" (OuterVolumeSpecName: "utilities") pod "fe46002b-150e-4e55-b1b5-50d658e6a2a4" (UID: "fe46002b-150e-4e55-b1b5-50d658e6a2a4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.344028 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe46002b-150e-4e55-b1b5-50d658e6a2a4-kube-api-access-slkht" (OuterVolumeSpecName: "kube-api-access-slkht") pod "fe46002b-150e-4e55-b1b5-50d658e6a2a4" (UID: "fe46002b-150e-4e55-b1b5-50d658e6a2a4"). InnerVolumeSpecName "kube-api-access-slkht". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.352240 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fe46002b-150e-4e55-b1b5-50d658e6a2a4" (UID: "fe46002b-150e-4e55-b1b5-50d658e6a2a4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.439274 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.439319 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fe46002b-150e-4e55-b1b5-50d658e6a2a4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.439333 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slkht\" (UniqueName: \"kubernetes.io/projected/fe46002b-150e-4e55-b1b5-50d658e6a2a4-kube-api-access-slkht\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:16 crc kubenswrapper[4771]: I1001 16:02:16.864530 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rf44r_must-gather-2mzcl_7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9/gather/0.log" Oct 01 16:02:17 crc kubenswrapper[4771]: I1001 16:02:17.204201 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kkm6f" Oct 01 16:02:17 crc kubenswrapper[4771]: I1001 16:02:17.261417 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6f"] Oct 01 16:02:17 crc kubenswrapper[4771]: I1001 16:02:17.271362 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kkm6f"] Oct 01 16:02:18 crc kubenswrapper[4771]: I1001 16:02:18.001863 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" path="/var/lib/kubelet/pods/fe46002b-150e-4e55-b1b5-50d658e6a2a4/volumes" Oct 01 16:02:24 crc kubenswrapper[4771]: I1001 16:02:24.857477 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rf44r/must-gather-2mzcl"] Oct 01 16:02:24 crc kubenswrapper[4771]: I1001 16:02:24.858347 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rf44r/must-gather-2mzcl" podUID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerName="copy" containerID="cri-o://b43a2123bb1eaa535c05de6ee8efc3e138362e16812b651b02a41593cc9cecda" gracePeriod=2 Oct 01 16:02:24 crc kubenswrapper[4771]: I1001 16:02:24.868893 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-rf44r/must-gather-2mzcl"] Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.286810 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rf44r_must-gather-2mzcl_7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9/copy/0.log" Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.290120 4771 generic.go:334] "Generic (PLEG): container finished" podID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerID="b43a2123bb1eaa535c05de6ee8efc3e138362e16812b651b02a41593cc9cecda" exitCode=143 Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.514289 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-rf44r_must-gather-2mzcl_7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9/copy/0.log" Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.514784 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.623861 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ltb2\" (UniqueName: \"kubernetes.io/projected/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-kube-api-access-8ltb2\") pod \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\" (UID: \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\") " Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.624144 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-must-gather-output\") pod \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\" (UID: \"7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9\") " Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.635411 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-kube-api-access-8ltb2" (OuterVolumeSpecName: "kube-api-access-8ltb2") pod "7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" (UID: "7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9"). InnerVolumeSpecName "kube-api-access-8ltb2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.726093 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ltb2\" (UniqueName: \"kubernetes.io/projected/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-kube-api-access-8ltb2\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.788424 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" (UID: "7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:02:25 crc kubenswrapper[4771]: I1001 16:02:25.830238 4771 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 16:02:26 crc kubenswrapper[4771]: I1001 16:02:26.000813 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" path="/var/lib/kubelet/pods/7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9/volumes" Oct 01 16:02:26 crc kubenswrapper[4771]: I1001 16:02:26.299871 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rf44r_must-gather-2mzcl_7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9/copy/0.log" Oct 01 16:02:26 crc kubenswrapper[4771]: I1001 16:02:26.300279 4771 scope.go:117] "RemoveContainer" containerID="b43a2123bb1eaa535c05de6ee8efc3e138362e16812b651b02a41593cc9cecda" Oct 01 16:02:26 crc kubenswrapper[4771]: I1001 16:02:26.300332 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rf44r/must-gather-2mzcl" Oct 01 16:02:26 crc kubenswrapper[4771]: I1001 16:02:26.320177 4771 scope.go:117] "RemoveContainer" containerID="201b1073dbb9a79ad0370e3e4ee1d23658754299a0614d3fd78e16a0863e704c" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.014768 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wfbm9/must-gather-brgcz"] Oct 01 16:03:07 crc kubenswrapper[4771]: E1001 16:03:07.016021 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerName="copy" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.016039 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerName="copy" Oct 01 16:03:07 crc kubenswrapper[4771]: E1001 16:03:07.016099 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerName="registry-server" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.016112 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerName="registry-server" Oct 01 16:03:07 crc kubenswrapper[4771]: E1001 16:03:07.016137 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerName="extract-utilities" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.016173 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerName="extract-utilities" Oct 01 16:03:07 crc kubenswrapper[4771]: E1001 16:03:07.016205 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerName="gather" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.016213 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerName="gather" Oct 01 16:03:07 crc 
kubenswrapper[4771]: E1001 16:03:07.016254 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerName="extract-content" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.016264 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerName="extract-content" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.016573 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe46002b-150e-4e55-b1b5-50d658e6a2a4" containerName="registry-server" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.016604 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerName="copy" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.016630 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d8f06d5-e155-46cc-bf0b-0a690ab4a3a9" containerName="gather" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.018266 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.025497 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wfbm9"/"openshift-service-ca.crt" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.025695 4771 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-wfbm9"/"kube-root-ca.crt" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.025890 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wfbm9/must-gather-brgcz"] Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.138916 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glzhm\" (UniqueName: \"kubernetes.io/projected/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-kube-api-access-glzhm\") pod \"must-gather-brgcz\" (UID: \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\") " pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.139026 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-must-gather-output\") pod \"must-gather-brgcz\" (UID: \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\") " pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.240392 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glzhm\" (UniqueName: \"kubernetes.io/projected/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-kube-api-access-glzhm\") pod \"must-gather-brgcz\" (UID: \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\") " pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.240484 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-must-gather-output\") pod \"must-gather-brgcz\" (UID: \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\") " pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.240936 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-must-gather-output\") pod \"must-gather-brgcz\" (UID: \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\") " pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.262792 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glzhm\" (UniqueName: \"kubernetes.io/projected/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-kube-api-access-glzhm\") pod \"must-gather-brgcz\" (UID: \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\") " pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.342072 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:03:07 crc kubenswrapper[4771]: I1001 16:03:07.837009 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wfbm9/must-gather-brgcz"] Oct 01 16:03:08 crc kubenswrapper[4771]: I1001 16:03:08.714265 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/must-gather-brgcz" event={"ID":"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b","Type":"ContainerStarted","Data":"ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a"} Oct 01 16:03:08 crc kubenswrapper[4771]: I1001 16:03:08.714312 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/must-gather-brgcz" event={"ID":"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b","Type":"ContainerStarted","Data":"ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70"} Oct 01 16:03:08 crc kubenswrapper[4771]: I1001 16:03:08.714322 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/must-gather-brgcz" event={"ID":"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b","Type":"ContainerStarted","Data":"4a5e036dfacb6782d23ef426e07165a26f6d58388a1d2d46597640fcfba32f6d"} Oct 01 16:03:08 crc kubenswrapper[4771]: I1001 16:03:08.738645 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wfbm9/must-gather-brgcz" podStartSLOduration=2.738622797 podStartE2EDuration="2.738622797s" podCreationTimestamp="2025-10-01 16:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:03:08.729130174 +0000 UTC m=+4033.348305365" watchObservedRunningTime="2025-10-01 16:03:08.738622797 +0000 UTC m=+4033.357797988" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.177398 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.178109 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.397613 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-gb8w8"] Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.398791 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.400440 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wfbm9"/"default-dockercfg-cdth4" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.434799 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b84840-2449-4577-8d61-b411b7b59012-host\") pod \"crc-debug-gb8w8\" (UID: \"96b84840-2449-4577-8d61-b411b7b59012\") " pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.434934 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmdhc\" (UniqueName: \"kubernetes.io/projected/96b84840-2449-4577-8d61-b411b7b59012-kube-api-access-tmdhc\") pod \"crc-debug-gb8w8\" (UID: \"96b84840-2449-4577-8d61-b411b7b59012\") " pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.536146 4771 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-tmdhc\" (UniqueName: \"kubernetes.io/projected/96b84840-2449-4577-8d61-b411b7b59012-kube-api-access-tmdhc\") pod \"crc-debug-gb8w8\" (UID: \"96b84840-2449-4577-8d61-b411b7b59012\") " pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.536565 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b84840-2449-4577-8d61-b411b7b59012-host\") pod \"crc-debug-gb8w8\" (UID: \"96b84840-2449-4577-8d61-b411b7b59012\") " pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.536710 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b84840-2449-4577-8d61-b411b7b59012-host\") pod \"crc-debug-gb8w8\" (UID: \"96b84840-2449-4577-8d61-b411b7b59012\") " pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.561380 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmdhc\" (UniqueName: \"kubernetes.io/projected/96b84840-2449-4577-8d61-b411b7b59012-kube-api-access-tmdhc\") pod \"crc-debug-gb8w8\" (UID: \"96b84840-2449-4577-8d61-b411b7b59012\") " pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:03:12 crc kubenswrapper[4771]: I1001 16:03:12.721863 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:03:13 crc kubenswrapper[4771]: I1001 16:03:13.776899 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" event={"ID":"96b84840-2449-4577-8d61-b411b7b59012","Type":"ContainerStarted","Data":"b6821859df514ce75e3d006b239ab2ceae671a4cf290c33c66d5a03d04c2ec28"} Oct 01 16:03:13 crc kubenswrapper[4771]: I1001 16:03:13.777241 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" event={"ID":"96b84840-2449-4577-8d61-b411b7b59012","Type":"ContainerStarted","Data":"06cf42a1b4c628a7bfdfd6c7d9a0a8666cc62e8b3d2c2c0991b9cc915ad8a842"} Oct 01 16:03:13 crc kubenswrapper[4771]: I1001 16:03:13.800777 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" podStartSLOduration=1.800757779 podStartE2EDuration="1.800757779s" podCreationTimestamp="2025-10-01 16:03:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:03:13.795715954 +0000 UTC m=+4038.414891125" watchObservedRunningTime="2025-10-01 16:03:13.800757779 +0000 UTC m=+4038.419932950" Oct 01 16:03:42 crc kubenswrapper[4771]: I1001 16:03:42.177740 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:03:42 crc kubenswrapper[4771]: I1001 16:03:42.178209 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Oct 01 16:03:50 crc kubenswrapper[4771]: I1001 16:03:50.886697 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bws7c"] Oct 01 16:03:50 crc kubenswrapper[4771]: I1001 16:03:50.889559 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:50 crc kubenswrapper[4771]: I1001 16:03:50.899901 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bws7c"] Oct 01 16:03:50 crc kubenswrapper[4771]: I1001 16:03:50.977689 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-catalog-content\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:50 crc kubenswrapper[4771]: I1001 16:03:50.977812 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmzpp\" (UniqueName: \"kubernetes.io/projected/356464e0-65a3-414e-89eb-3232c844b9a8-kube-api-access-nmzpp\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:50 crc kubenswrapper[4771]: I1001 16:03:50.977887 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-utilities\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:51 crc kubenswrapper[4771]: I1001 16:03:51.079535 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmzpp\" (UniqueName: 
\"kubernetes.io/projected/356464e0-65a3-414e-89eb-3232c844b9a8-kube-api-access-nmzpp\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:51 crc kubenswrapper[4771]: I1001 16:03:51.081953 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-utilities\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:51 crc kubenswrapper[4771]: I1001 16:03:51.082245 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-catalog-content\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:51 crc kubenswrapper[4771]: I1001 16:03:51.083776 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-utilities\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:51 crc kubenswrapper[4771]: I1001 16:03:51.085523 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-catalog-content\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:51 crc kubenswrapper[4771]: I1001 16:03:51.099541 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmzpp\" (UniqueName: 
\"kubernetes.io/projected/356464e0-65a3-414e-89eb-3232c844b9a8-kube-api-access-nmzpp\") pod \"redhat-operators-bws7c\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:51 crc kubenswrapper[4771]: I1001 16:03:51.225470 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:03:51 crc kubenswrapper[4771]: I1001 16:03:51.733326 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bws7c"] Oct 01 16:03:52 crc kubenswrapper[4771]: I1001 16:03:52.119323 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bws7c" event={"ID":"356464e0-65a3-414e-89eb-3232c844b9a8","Type":"ContainerStarted","Data":"8e2c9a4fb9368b786d4222b436ecb0614c338691897ff69d5f1955d305d1f8b5"} Oct 01 16:03:53 crc kubenswrapper[4771]: I1001 16:03:53.130972 4771 generic.go:334] "Generic (PLEG): container finished" podID="356464e0-65a3-414e-89eb-3232c844b9a8" containerID="492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470" exitCode=0 Oct 01 16:03:53 crc kubenswrapper[4771]: I1001 16:03:53.131056 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bws7c" event={"ID":"356464e0-65a3-414e-89eb-3232c844b9a8","Type":"ContainerDied","Data":"492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470"} Oct 01 16:03:55 crc kubenswrapper[4771]: I1001 16:03:55.154614 4771 generic.go:334] "Generic (PLEG): container finished" podID="356464e0-65a3-414e-89eb-3232c844b9a8" containerID="9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361" exitCode=0 Oct 01 16:03:55 crc kubenswrapper[4771]: I1001 16:03:55.154674 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bws7c" 
event={"ID":"356464e0-65a3-414e-89eb-3232c844b9a8","Type":"ContainerDied","Data":"9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361"} Oct 01 16:03:58 crc kubenswrapper[4771]: I1001 16:03:58.182565 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bws7c" event={"ID":"356464e0-65a3-414e-89eb-3232c844b9a8","Type":"ContainerStarted","Data":"7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733"} Oct 01 16:03:58 crc kubenswrapper[4771]: I1001 16:03:58.205641 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bws7c" podStartSLOduration=4.156874358 podStartE2EDuration="8.205619599s" podCreationTimestamp="2025-10-01 16:03:50 +0000 UTC" firstStartedPulling="2025-10-01 16:03:53.135671946 +0000 UTC m=+4077.754847117" lastFinishedPulling="2025-10-01 16:03:57.184417167 +0000 UTC m=+4081.803592358" observedRunningTime="2025-10-01 16:03:58.199793936 +0000 UTC m=+4082.818969117" watchObservedRunningTime="2025-10-01 16:03:58.205619599 +0000 UTC m=+4082.824794770" Oct 01 16:04:01 crc kubenswrapper[4771]: I1001 16:04:01.226098 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:04:01 crc kubenswrapper[4771]: I1001 16:04:01.227814 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:04:02 crc kubenswrapper[4771]: I1001 16:04:02.275171 4771 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bws7c" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="registry-server" probeResult="failure" output=< Oct 01 16:04:02 crc kubenswrapper[4771]: timeout: failed to connect service ":50051" within 1s Oct 01 16:04:02 crc kubenswrapper[4771]: > Oct 01 16:04:11 crc kubenswrapper[4771]: I1001 16:04:11.291210 4771 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:04:11 crc kubenswrapper[4771]: I1001 16:04:11.340763 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:04:11 crc kubenswrapper[4771]: I1001 16:04:11.524027 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bws7c"] Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.177523 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.177908 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.177965 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.178918 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"19c8aefbbf261f233703d5d75b1ff28f73820b1b39703c62ab135fd2a27acfcb"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.178996 4771 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://19c8aefbbf261f233703d5d75b1ff28f73820b1b39703c62ab135fd2a27acfcb" gracePeriod=600 Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.332214 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="19c8aefbbf261f233703d5d75b1ff28f73820b1b39703c62ab135fd2a27acfcb" exitCode=0 Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.332407 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bws7c" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="registry-server" containerID="cri-o://7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733" gracePeriod=2 Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.332481 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"19c8aefbbf261f233703d5d75b1ff28f73820b1b39703c62ab135fd2a27acfcb"} Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.332510 4771 scope.go:117] "RemoveContainer" containerID="a902474c94f7a5091bc2240a7b526fc1a45caf03e740411bdb189e934ec62269" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.802347 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.897465 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmzpp\" (UniqueName: \"kubernetes.io/projected/356464e0-65a3-414e-89eb-3232c844b9a8-kube-api-access-nmzpp\") pod \"356464e0-65a3-414e-89eb-3232c844b9a8\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.897538 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-utilities\") pod \"356464e0-65a3-414e-89eb-3232c844b9a8\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.897655 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-catalog-content\") pod \"356464e0-65a3-414e-89eb-3232c844b9a8\" (UID: \"356464e0-65a3-414e-89eb-3232c844b9a8\") " Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.898376 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-utilities" (OuterVolumeSpecName: "utilities") pod "356464e0-65a3-414e-89eb-3232c844b9a8" (UID: "356464e0-65a3-414e-89eb-3232c844b9a8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.910096 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/356464e0-65a3-414e-89eb-3232c844b9a8-kube-api-access-nmzpp" (OuterVolumeSpecName: "kube-api-access-nmzpp") pod "356464e0-65a3-414e-89eb-3232c844b9a8" (UID: "356464e0-65a3-414e-89eb-3232c844b9a8"). InnerVolumeSpecName "kube-api-access-nmzpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.995421 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "356464e0-65a3-414e-89eb-3232c844b9a8" (UID: "356464e0-65a3-414e-89eb-3232c844b9a8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.999278 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmzpp\" (UniqueName: \"kubernetes.io/projected/356464e0-65a3-414e-89eb-3232c844b9a8-kube-api-access-nmzpp\") on node \"crc\" DevicePath \"\"" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.999314 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:04:12 crc kubenswrapper[4771]: I1001 16:04:12.999324 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/356464e0-65a3-414e-89eb-3232c844b9a8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.349634 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerStarted","Data":"f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"} Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.353328 4771 generic.go:334] "Generic (PLEG): container finished" podID="356464e0-65a3-414e-89eb-3232c844b9a8" containerID="7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733" exitCode=0 Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.353397 4771 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bws7c" event={"ID":"356464e0-65a3-414e-89eb-3232c844b9a8","Type":"ContainerDied","Data":"7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733"} Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.353432 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bws7c" event={"ID":"356464e0-65a3-414e-89eb-3232c844b9a8","Type":"ContainerDied","Data":"8e2c9a4fb9368b786d4222b436ecb0614c338691897ff69d5f1955d305d1f8b5"} Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.353440 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bws7c" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.353455 4771 scope.go:117] "RemoveContainer" containerID="7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.384836 4771 scope.go:117] "RemoveContainer" containerID="9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.406975 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bws7c"] Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.415035 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bws7c"] Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.420294 4771 scope.go:117] "RemoveContainer" containerID="492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.475274 4771 scope.go:117] "RemoveContainer" containerID="7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733" Oct 01 16:04:13 crc kubenswrapper[4771]: E1001 16:04:13.475862 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733\": container with ID starting with 7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733 not found: ID does not exist" containerID="7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.476009 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733"} err="failed to get container status \"7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733\": rpc error: code = NotFound desc = could not find container \"7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733\": container with ID starting with 7b62c38d8513ba5877ea1aaf672b1ab2b92fe64a02a42f441d0f43c899fa9733 not found: ID does not exist" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.476121 4771 scope.go:117] "RemoveContainer" containerID="9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361" Oct 01 16:04:13 crc kubenswrapper[4771]: E1001 16:04:13.476601 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361\": container with ID starting with 9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361 not found: ID does not exist" containerID="9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.476645 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361"} err="failed to get container status \"9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361\": rpc error: code = NotFound desc = could not find container \"9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361\": container with ID 
starting with 9b3e3af84bff92fa00eab19326d6a6a6e282581fed79dc4338c2f8d96dfb5361 not found: ID does not exist" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.476676 4771 scope.go:117] "RemoveContainer" containerID="492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470" Oct 01 16:04:13 crc kubenswrapper[4771]: E1001 16:04:13.477032 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470\": container with ID starting with 492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470 not found: ID does not exist" containerID="492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.477063 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470"} err="failed to get container status \"492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470\": rpc error: code = NotFound desc = could not find container \"492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470\": container with ID starting with 492e1c81e2cebeb0dccd59c07561830ae10826ffd2bfb519fa76caae94cbd470 not found: ID does not exist" Oct 01 16:04:13 crc kubenswrapper[4771]: I1001 16:04:13.996353 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" path="/var/lib/kubelet/pods/356464e0-65a3-414e-89eb-3232c844b9a8/volumes" Oct 01 16:04:17 crc kubenswrapper[4771]: I1001 16:04:17.924420 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-59879d9576-fvcgl_d7137719-9397-4b5e-97ae-10176a7deea3/barbican-api-log/0.log" Oct 01 16:04:17 crc kubenswrapper[4771]: I1001 16:04:17.929912 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-59879d9576-fvcgl_d7137719-9397-4b5e-97ae-10176a7deea3/barbican-api/0.log" Oct 01 16:04:18 crc kubenswrapper[4771]: I1001 16:04:18.130381 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-755c69f65b-sb4nj_90aaa270-c5a1-47b4-8adc-2bd096da3ab0/barbican-keystone-listener/0.log" Oct 01 16:04:18 crc kubenswrapper[4771]: I1001 16:04:18.149580 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-755c69f65b-sb4nj_90aaa270-c5a1-47b4-8adc-2bd096da3ab0/barbican-keystone-listener-log/0.log" Oct 01 16:04:18 crc kubenswrapper[4771]: I1001 16:04:18.351525 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74dd9b479f-cpgmx_962b1815-b3dd-47fc-afdf-97a82cc67893/barbican-worker/0.log" Oct 01 16:04:18 crc kubenswrapper[4771]: I1001 16:04:18.419697 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-74dd9b479f-cpgmx_962b1815-b3dd-47fc-afdf-97a82cc67893/barbican-worker-log/0.log" Oct 01 16:04:18 crc kubenswrapper[4771]: I1001 16:04:18.588093 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2hsb8_0498d724-f802-4a21-9197-f87079f3c96e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:18 crc kubenswrapper[4771]: I1001 16:04:18.777568 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fb688fd-269e-4d0f-a84f-ccb670696d20/ceilometer-notification-agent/0.log" Oct 01 16:04:18 crc kubenswrapper[4771]: I1001 16:04:18.802378 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fb688fd-269e-4d0f-a84f-ccb670696d20/ceilometer-central-agent/0.log" Oct 01 16:04:19 crc kubenswrapper[4771]: I1001 16:04:19.454135 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_3fb688fd-269e-4d0f-a84f-ccb670696d20/proxy-httpd/0.log" Oct 01 16:04:19 crc kubenswrapper[4771]: I1001 16:04:19.492220 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_3fb688fd-269e-4d0f-a84f-ccb670696d20/sg-core/0.log" Oct 01 16:04:19 crc kubenswrapper[4771]: I1001 16:04:19.693321 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_12fc4771-6958-4668-adcc-6aa10e36e1ea/cinder-api-log/0.log" Oct 01 16:04:19 crc kubenswrapper[4771]: I1001 16:04:19.743043 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_12fc4771-6958-4668-adcc-6aa10e36e1ea/cinder-api/0.log" Oct 01 16:04:20 crc kubenswrapper[4771]: I1001 16:04:20.103951 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3a36d28b-706e-4639-9d68-158427aaa655/probe/0.log" Oct 01 16:04:20 crc kubenswrapper[4771]: I1001 16:04:20.151981 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3a36d28b-706e-4639-9d68-158427aaa655/cinder-scheduler/0.log" Oct 01 16:04:20 crc kubenswrapper[4771]: I1001 16:04:20.323610 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ptcjt_5598c0d1-a4ba-4824-8111-dddf70823911/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:20 crc kubenswrapper[4771]: I1001 16:04:20.455426 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-txt9q_a64a2e26-92a1-4578-9a5a-fc5e8062f1b8/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:20 crc kubenswrapper[4771]: I1001 16:04:20.666685 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-vv89q_ae5eb9bd-1612-4698-850a-21e0b335a920/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:20 crc kubenswrapper[4771]: I1001 16:04:20.785712 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-952sj_e5a19de7-6ecc-4f22-bc0c-18f3761eef3c/init/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.003929 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-952sj_e5a19de7-6ecc-4f22-bc0c-18f3761eef3c/init/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.046268 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-9h2g6_c69dcf56-20fa-4a9a-992c-a73435ff9102/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.066223 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-cb6ffcf87-952sj_e5a19de7-6ecc-4f22-bc0c-18f3761eef3c/dnsmasq-dns/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.258351 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bd39907f-ab26-47d6-9d78-0b0437de4b04/glance-log/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.279186 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_bd39907f-ab26-47d6-9d78-0b0437de4b04/glance-httpd/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.443168 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_83dc7d05-3ef5-4da2-b4ea-58d3c11d4528/glance-log/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.492196 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_83dc7d05-3ef5-4da2-b4ea-58d3c11d4528/glance-httpd/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.713167 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-qk797_e6e7232e-0b6f-433f-a1e5-f99aab22ed8a/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.759446 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-66679756f6-g56hw_405e12dd-6888-4994-ac26-b2836ad9069c/horizon/0.log" Oct 01 16:04:21 crc kubenswrapper[4771]: I1001 16:04:21.954981 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zkh9n_333518f5-86a1-4afc-974d-c3dbee185c42/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:22 crc kubenswrapper[4771]: I1001 16:04:22.237228 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29322241-5g5cn_8373a131-a79c-4385-a8bd-949721e18222/keystone-cron/0.log" Oct 01 16:04:22 crc kubenswrapper[4771]: I1001 16:04:22.359092 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-66679756f6-g56hw_405e12dd-6888-4994-ac26-b2836ad9069c/horizon-log/0.log" Oct 01 16:04:22 crc kubenswrapper[4771]: I1001 16:04:22.448211 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-74b8bdcb7c-xttgq_5e356f03-9445-4825-a39d-8b564bd4ea1c/keystone-api/0.log" Oct 01 16:04:22 crc kubenswrapper[4771]: I1001 16:04:22.537367 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_bbfa5749-f148-47da-8cbf-b88b1ea7bd9f/kube-state-metrics/0.log" Oct 01 16:04:22 crc kubenswrapper[4771]: I1001 16:04:22.658027 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-hvr6k_87149516-d807-4412-90a5-e127c03943e0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:23 crc kubenswrapper[4771]: I1001 16:04:23.028024 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67bb68cc5c-l7gnn_a7b28e9a-d59d-4aba-97c6-9102ada72a28/neutron-httpd/0.log" Oct 01 16:04:23 crc kubenswrapper[4771]: I1001 16:04:23.046862 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67bb68cc5c-l7gnn_a7b28e9a-d59d-4aba-97c6-9102ada72a28/neutron-api/0.log" Oct 01 16:04:23 crc kubenswrapper[4771]: I1001 16:04:23.336026 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-bng8s_9c50c132-15f0-45c7-a895-46fe2be6003e/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:23 crc kubenswrapper[4771]: I1001 16:04:23.839278 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f5ed9a0d-0d21-4432-aae5-dca422c5c331/nova-api-log/0.log" Oct 01 16:04:24 crc kubenswrapper[4771]: I1001 16:04:24.061313 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f5ed9a0d-0d21-4432-aae5-dca422c5c331/nova-api-api/0.log" Oct 01 16:04:24 crc kubenswrapper[4771]: I1001 16:04:24.265624 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a01636b1-c705-4844-94b8-bb58e65faa1f/nova-cell0-conductor-conductor/0.log" Oct 01 16:04:24 crc kubenswrapper[4771]: I1001 16:04:24.430847 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_8a756a41-c563-4d6c-a5e8-907724f1847c/nova-cell1-conductor-conductor/0.log" Oct 01 16:04:24 crc kubenswrapper[4771]: I1001 16:04:24.617532 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_a8db97b3-f960-4eff-a879-c2b42c4e6364/nova-cell1-novncproxy-novncproxy/0.log" Oct 01 16:04:24 crc kubenswrapper[4771]: I1001 16:04:24.750796 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lbdsd_ea0299a3-63d8-41e7-a23d-4ddd7491df9c/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:24 crc kubenswrapper[4771]: I1001 16:04:24.971632 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10227ed2-4069-45bc-b3b9-091bb98d72af/nova-metadata-log/0.log" Oct 01 16:04:25 crc kubenswrapper[4771]: I1001 16:04:25.426446 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_f5a9190d-63c4-47e3-9fcd-ed0e0615d807/nova-scheduler-scheduler/0.log" Oct 01 16:04:25 crc kubenswrapper[4771]: I1001 16:04:25.642863 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_37c38012-d257-4269-86fa-8cf3ef4de4cd/mysql-bootstrap/0.log" Oct 01 16:04:25 crc kubenswrapper[4771]: I1001 16:04:25.757511 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_37c38012-d257-4269-86fa-8cf3ef4de4cd/mysql-bootstrap/0.log" Oct 01 16:04:25 crc kubenswrapper[4771]: I1001 16:04:25.847408 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_37c38012-d257-4269-86fa-8cf3ef4de4cd/galera/0.log" Oct 01 16:04:26 crc kubenswrapper[4771]: I1001 16:04:26.045716 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e902a14c-a59a-4278-b560-33de2cb50d32/mysql-bootstrap/0.log" Oct 01 16:04:26 crc kubenswrapper[4771]: I1001 16:04:26.296571 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e902a14c-a59a-4278-b560-33de2cb50d32/mysql-bootstrap/0.log" Oct 01 16:04:26 crc kubenswrapper[4771]: I1001 16:04:26.336693 4771 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_e902a14c-a59a-4278-b560-33de2cb50d32/galera/0.log" Oct 01 16:04:26 crc kubenswrapper[4771]: I1001 16:04:26.497465 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_10227ed2-4069-45bc-b3b9-091bb98d72af/nova-metadata-metadata/0.log" Oct 01 16:04:26 crc kubenswrapper[4771]: I1001 16:04:26.547212 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_a40ee0c4-9c6b-4ed0-9f06-1bb104bb9a11/openstackclient/0.log" Oct 01 16:04:26 crc kubenswrapper[4771]: I1001 16:04:26.747406 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tckkt_553f381f-ef83-4876-8a81-df81a5be7dd8/openstack-network-exporter/0.log" Oct 01 16:04:26 crc kubenswrapper[4771]: I1001 16:04:26.922116 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rv4hj_74fee09d-11ad-45f4-a779-4c352b6dc67f/ovsdb-server-init/0.log" Oct 01 16:04:27 crc kubenswrapper[4771]: I1001 16:04:27.233787 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rv4hj_74fee09d-11ad-45f4-a779-4c352b6dc67f/ovsdb-server-init/0.log" Oct 01 16:04:27 crc kubenswrapper[4771]: I1001 16:04:27.256416 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rv4hj_74fee09d-11ad-45f4-a779-4c352b6dc67f/ovsdb-server/0.log" Oct 01 16:04:27 crc kubenswrapper[4771]: I1001 16:04:27.277307 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-rv4hj_74fee09d-11ad-45f4-a779-4c352b6dc67f/ovs-vswitchd/0.log" Oct 01 16:04:27 crc kubenswrapper[4771]: I1001 16:04:27.494087 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-zpdvh_20d8761e-4ce2-4312-8a80-8c3ce8908f2c/ovn-controller/0.log" Oct 01 16:04:27 crc kubenswrapper[4771]: I1001 16:04:27.701456 4771 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-h96xv_0bd4b4c7-4ca8-46c2-91c8-6aaa1cf19295/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:27 crc kubenswrapper[4771]: I1001 16:04:27.990110 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_553def8f-6710-4724-a4b7-a9f6e2c310e6/openstack-network-exporter/0.log" Oct 01 16:04:28 crc kubenswrapper[4771]: I1001 16:04:28.161469 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_553def8f-6710-4724-a4b7-a9f6e2c310e6/ovn-northd/0.log" Oct 01 16:04:28 crc kubenswrapper[4771]: I1001 16:04:28.269435 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_064359ac-92c0-4674-a919-ccb8ffc0a5df/openstack-network-exporter/0.log" Oct 01 16:04:28 crc kubenswrapper[4771]: I1001 16:04:28.446413 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_064359ac-92c0-4674-a919-ccb8ffc0a5df/ovsdbserver-nb/0.log" Oct 01 16:04:28 crc kubenswrapper[4771]: I1001 16:04:28.496002 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_999431d1-6d92-46de-ba0f-b253f96fe627/openstack-network-exporter/0.log" Oct 01 16:04:28 crc kubenswrapper[4771]: I1001 16:04:28.625724 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_999431d1-6d92-46de-ba0f-b253f96fe627/ovsdbserver-sb/0.log" Oct 01 16:04:28 crc kubenswrapper[4771]: I1001 16:04:28.767863 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67bb557f68-mz5cv_6dfa4374-0400-489e-90eb-baca0f8afdfd/placement-api/0.log" Oct 01 16:04:28 crc kubenswrapper[4771]: I1001 16:04:28.924785 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-67bb557f68-mz5cv_6dfa4374-0400-489e-90eb-baca0f8afdfd/placement-log/0.log" Oct 01 16:04:29 crc kubenswrapper[4771]: I1001 16:04:29.007025 
4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_15eb6248-64ff-4f3d-bcb4-4d78026673d4/setup-container/0.log" Oct 01 16:04:29 crc kubenswrapper[4771]: I1001 16:04:29.190869 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_15eb6248-64ff-4f3d-bcb4-4d78026673d4/setup-container/0.log" Oct 01 16:04:29 crc kubenswrapper[4771]: I1001 16:04:29.219210 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_15eb6248-64ff-4f3d-bcb4-4d78026673d4/rabbitmq/0.log" Oct 01 16:04:29 crc kubenswrapper[4771]: I1001 16:04:29.431206 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad32a9ec-a803-4d44-a4c1-03447e26e983/setup-container/0.log" Oct 01 16:04:29 crc kubenswrapper[4771]: I1001 16:04:29.622826 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad32a9ec-a803-4d44-a4c1-03447e26e983/setup-container/0.log" Oct 01 16:04:29 crc kubenswrapper[4771]: I1001 16:04:29.675449 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_ad32a9ec-a803-4d44-a4c1-03447e26e983/rabbitmq/0.log" Oct 01 16:04:29 crc kubenswrapper[4771]: I1001 16:04:29.869915 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-6gkfz_560e443b-7ae0-4b0c-912d-6f7895b3a8dd/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:30 crc kubenswrapper[4771]: I1001 16:04:30.011231 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-l4srh_856ab139-589a-4b24-89ab-37ef20ef1762/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:30 crc kubenswrapper[4771]: I1001 16:04:30.180829 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-r8pxs_a91e8801-f8f9-4ce4-ba42-a4fa54057ec1/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:30 crc kubenswrapper[4771]: I1001 16:04:30.354204 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-d6dm9_8a04e35a-e2e5-412d-ab61-896f5271ac14/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:30 crc kubenswrapper[4771]: I1001 16:04:30.823129 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cg72d_3d9fa44b-220b-4f14-824c-1393dd61fc88/ssh-known-hosts-edpm-deployment/0.log" Oct 01 16:04:31 crc kubenswrapper[4771]: I1001 16:04:31.049142 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d6484bc47-hkzdw_e4b812be-6e39-4ac8-b43f-dba345603f74/proxy-server/0.log" Oct 01 16:04:31 crc kubenswrapper[4771]: I1001 16:04:31.232993 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hzvjt_a56d3441-b413-4629-870b-49c208943243/swift-ring-rebalance/0.log" Oct 01 16:04:31 crc kubenswrapper[4771]: I1001 16:04:31.234627 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-d6484bc47-hkzdw_e4b812be-6e39-4ac8-b43f-dba345603f74/proxy-httpd/0.log" Oct 01 16:04:31 crc kubenswrapper[4771]: I1001 16:04:31.472168 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/account-auditor/0.log" Oct 01 16:04:31 crc kubenswrapper[4771]: I1001 16:04:31.475259 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/account-reaper/0.log" Oct 01 16:04:31 crc kubenswrapper[4771]: I1001 16:04:31.840430 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/account-replicator/0.log" Oct 01 16:04:31 crc kubenswrapper[4771]: I1001 16:04:31.870459 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/account-server/0.log" Oct 01 16:04:31 crc kubenswrapper[4771]: I1001 16:04:31.900343 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/container-auditor/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.247096 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/container-server/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.280452 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/container-updater/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.297248 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/container-replicator/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.493519 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-auditor/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.506170 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-replicator/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.519937 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-expirer/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.668849 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-server/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.720221 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/rsync/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.743710 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/object-updater/0.log" Oct 01 16:04:32 crc kubenswrapper[4771]: I1001 16:04:32.881528 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_1916131b-f4ff-4f49-8abc-a640dc07abc4/swift-recon-cron/0.log" Oct 01 16:04:33 crc kubenswrapper[4771]: I1001 16:04:33.030490 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-dhjp6_b938863c-4e4f-414a-9b0b-2d2583d9ae0c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:33 crc kubenswrapper[4771]: I1001 16:04:33.254955 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_32741182-3a7c-43a7-b996-1dd78a418dc6/tempest-tests-tempest-tests-runner/0.log" Oct 01 16:04:33 crc kubenswrapper[4771]: I1001 16:04:33.297987 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e0b503da-0417-4b9a-b62e-3f13be34b988/test-operator-logs-container/0.log" Oct 01 16:04:33 crc kubenswrapper[4771]: I1001 16:04:33.479103 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-nnjkx_9fb926c9-80f3-4d82-9de5-a4f0fc314ef5/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 01 16:04:41 crc kubenswrapper[4771]: I1001 16:04:41.308342 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_memcached-0_107cd834-196a-4454-b70b-cbb3ab3631df/memcached/0.log" Oct 01 16:05:04 crc kubenswrapper[4771]: I1001 16:05:04.498073 4771 scope.go:117] "RemoveContainer" containerID="133293e0c47bc996b01edc19003009f0be84799b420eadd057f352fa78f06791" Oct 01 16:05:07 crc kubenswrapper[4771]: I1001 16:05:07.877648 4771 generic.go:334] "Generic (PLEG): container finished" podID="96b84840-2449-4577-8d61-b411b7b59012" containerID="b6821859df514ce75e3d006b239ab2ceae671a4cf290c33c66d5a03d04c2ec28" exitCode=0 Oct 01 16:05:07 crc kubenswrapper[4771]: I1001 16:05:07.877757 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" event={"ID":"96b84840-2449-4577-8d61-b411b7b59012","Type":"ContainerDied","Data":"b6821859df514ce75e3d006b239ab2ceae671a4cf290c33c66d5a03d04c2ec28"} Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.001227 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.053757 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-gb8w8"] Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.065551 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-gb8w8"] Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.189589 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b84840-2449-4577-8d61-b411b7b59012-host\") pod \"96b84840-2449-4577-8d61-b411b7b59012\" (UID: \"96b84840-2449-4577-8d61-b411b7b59012\") " Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.190164 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmdhc\" (UniqueName: \"kubernetes.io/projected/96b84840-2449-4577-8d61-b411b7b59012-kube-api-access-tmdhc\") pod 
\"96b84840-2449-4577-8d61-b411b7b59012\" (UID: \"96b84840-2449-4577-8d61-b411b7b59012\") " Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.189724 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96b84840-2449-4577-8d61-b411b7b59012-host" (OuterVolumeSpecName: "host") pod "96b84840-2449-4577-8d61-b411b7b59012" (UID: "96b84840-2449-4577-8d61-b411b7b59012"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.191497 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/96b84840-2449-4577-8d61-b411b7b59012-host\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.198974 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b84840-2449-4577-8d61-b411b7b59012-kube-api-access-tmdhc" (OuterVolumeSpecName: "kube-api-access-tmdhc") pod "96b84840-2449-4577-8d61-b411b7b59012" (UID: "96b84840-2449-4577-8d61-b411b7b59012"). InnerVolumeSpecName "kube-api-access-tmdhc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.293580 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmdhc\" (UniqueName: \"kubernetes.io/projected/96b84840-2449-4577-8d61-b411b7b59012-kube-api-access-tmdhc\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.904826 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06cf42a1b4c628a7bfdfd6c7d9a0a8666cc62e8b3d2c2c0991b9cc915ad8a842" Oct 01 16:05:09 crc kubenswrapper[4771]: I1001 16:05:09.905480 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-gb8w8" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.011906 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b84840-2449-4577-8d61-b411b7b59012" path="/var/lib/kubelet/pods/96b84840-2449-4577-8d61-b411b7b59012/volumes" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.241396 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-z57gv"] Oct 01 16:05:10 crc kubenswrapper[4771]: E1001 16:05:10.242015 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="registry-server" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.242027 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="registry-server" Oct 01 16:05:10 crc kubenswrapper[4771]: E1001 16:05:10.242060 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="extract-content" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.242067 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="extract-content" Oct 01 16:05:10 crc kubenswrapper[4771]: E1001 16:05:10.242090 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="extract-utilities" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.242097 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="extract-utilities" Oct 01 16:05:10 crc kubenswrapper[4771]: E1001 16:05:10.242103 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96b84840-2449-4577-8d61-b411b7b59012" containerName="container-00" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.242108 4771 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="96b84840-2449-4577-8d61-b411b7b59012" containerName="container-00" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.242289 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="356464e0-65a3-414e-89eb-3232c844b9a8" containerName="registry-server" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.242309 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="96b84840-2449-4577-8d61-b411b7b59012" containerName="container-00" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.242899 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.245261 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wfbm9"/"default-dockercfg-cdth4" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.413574 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rdwp\" (UniqueName: \"kubernetes.io/projected/0c27d800-f3db-40e5-ba92-e620137bab5c-kube-api-access-4rdwp\") pod \"crc-debug-z57gv\" (UID: \"0c27d800-f3db-40e5-ba92-e620137bab5c\") " pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.413622 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c27d800-f3db-40e5-ba92-e620137bab5c-host\") pod \"crc-debug-z57gv\" (UID: \"0c27d800-f3db-40e5-ba92-e620137bab5c\") " pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.515148 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c27d800-f3db-40e5-ba92-e620137bab5c-host\") pod \"crc-debug-z57gv\" (UID: \"0c27d800-f3db-40e5-ba92-e620137bab5c\") " 
pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.515183 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rdwp\" (UniqueName: \"kubernetes.io/projected/0c27d800-f3db-40e5-ba92-e620137bab5c-kube-api-access-4rdwp\") pod \"crc-debug-z57gv\" (UID: \"0c27d800-f3db-40e5-ba92-e620137bab5c\") " pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.515312 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c27d800-f3db-40e5-ba92-e620137bab5c-host\") pod \"crc-debug-z57gv\" (UID: \"0c27d800-f3db-40e5-ba92-e620137bab5c\") " pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:10 crc kubenswrapper[4771]: I1001 16:05:10.878266 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rdwp\" (UniqueName: \"kubernetes.io/projected/0c27d800-f3db-40e5-ba92-e620137bab5c-kube-api-access-4rdwp\") pod \"crc-debug-z57gv\" (UID: \"0c27d800-f3db-40e5-ba92-e620137bab5c\") " pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:11 crc kubenswrapper[4771]: I1001 16:05:11.167336 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:11 crc kubenswrapper[4771]: I1001 16:05:11.928690 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" event={"ID":"0c27d800-f3db-40e5-ba92-e620137bab5c","Type":"ContainerStarted","Data":"3949ede917f2265a4e0de615715fed5c545d6a5e5ac9004bbc547b4e2d67ebc5"} Oct 01 16:05:11 crc kubenswrapper[4771]: I1001 16:05:11.929112 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" event={"ID":"0c27d800-f3db-40e5-ba92-e620137bab5c","Type":"ContainerStarted","Data":"9666a1959e84c582d416cf5820df1fa3e8c27a88cf9f0872eda2b56db4628ec1"} Oct 01 16:05:11 crc kubenswrapper[4771]: I1001 16:05:11.950410 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" podStartSLOduration=1.950386221 podStartE2EDuration="1.950386221s" podCreationTimestamp="2025-10-01 16:05:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-01 16:05:11.942466765 +0000 UTC m=+4156.561641976" watchObservedRunningTime="2025-10-01 16:05:11.950386221 +0000 UTC m=+4156.569561432" Oct 01 16:05:12 crc kubenswrapper[4771]: I1001 16:05:12.940020 4771 generic.go:334] "Generic (PLEG): container finished" podID="0c27d800-f3db-40e5-ba92-e620137bab5c" containerID="3949ede917f2265a4e0de615715fed5c545d6a5e5ac9004bbc547b4e2d67ebc5" exitCode=0 Oct 01 16:05:12 crc kubenswrapper[4771]: I1001 16:05:12.940075 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" event={"ID":"0c27d800-f3db-40e5-ba92-e620137bab5c","Type":"ContainerDied","Data":"3949ede917f2265a4e0de615715fed5c545d6a5e5ac9004bbc547b4e2d67ebc5"} Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.052588 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.180425 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rdwp\" (UniqueName: \"kubernetes.io/projected/0c27d800-f3db-40e5-ba92-e620137bab5c-kube-api-access-4rdwp\") pod \"0c27d800-f3db-40e5-ba92-e620137bab5c\" (UID: \"0c27d800-f3db-40e5-ba92-e620137bab5c\") " Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.181173 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c27d800-f3db-40e5-ba92-e620137bab5c-host\") pod \"0c27d800-f3db-40e5-ba92-e620137bab5c\" (UID: \"0c27d800-f3db-40e5-ba92-e620137bab5c\") " Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.181279 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0c27d800-f3db-40e5-ba92-e620137bab5c-host" (OuterVolumeSpecName: "host") pod "0c27d800-f3db-40e5-ba92-e620137bab5c" (UID: "0c27d800-f3db-40e5-ba92-e620137bab5c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.184050 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0c27d800-f3db-40e5-ba92-e620137bab5c-host\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.255990 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c27d800-f3db-40e5-ba92-e620137bab5c-kube-api-access-4rdwp" (OuterVolumeSpecName: "kube-api-access-4rdwp") pod "0c27d800-f3db-40e5-ba92-e620137bab5c" (UID: "0c27d800-f3db-40e5-ba92-e620137bab5c"). InnerVolumeSpecName "kube-api-access-4rdwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.285511 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rdwp\" (UniqueName: \"kubernetes.io/projected/0c27d800-f3db-40e5-ba92-e620137bab5c-kube-api-access-4rdwp\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.962604 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" event={"ID":"0c27d800-f3db-40e5-ba92-e620137bab5c","Type":"ContainerDied","Data":"9666a1959e84c582d416cf5820df1fa3e8c27a88cf9f0872eda2b56db4628ec1"} Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.962874 4771 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9666a1959e84c582d416cf5820df1fa3e8c27a88cf9f0872eda2b56db4628ec1" Oct 01 16:05:14 crc kubenswrapper[4771]: I1001 16:05:14.962929 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-z57gv" Oct 01 16:05:18 crc kubenswrapper[4771]: I1001 16:05:18.826663 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-z57gv"] Oct 01 16:05:18 crc kubenswrapper[4771]: I1001 16:05:18.836892 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-z57gv"] Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.001284 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c27d800-f3db-40e5-ba92-e620137bab5c" path="/var/lib/kubelet/pods/0c27d800-f3db-40e5-ba92-e620137bab5c/volumes" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.009987 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-srwcz"] Oct 01 16:05:20 crc kubenswrapper[4771]: E1001 16:05:20.010777 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c27d800-f3db-40e5-ba92-e620137bab5c" 
containerName="container-00" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.010801 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c27d800-f3db-40e5-ba92-e620137bab5c" containerName="container-00" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.011057 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c27d800-f3db-40e5-ba92-e620137bab5c" containerName="container-00" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.011839 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.014291 4771 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-wfbm9"/"default-dockercfg-cdth4" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.078142 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72p55\" (UniqueName: \"kubernetes.io/projected/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-kube-api-access-72p55\") pod \"crc-debug-srwcz\" (UID: \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\") " pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.078597 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-host\") pod \"crc-debug-srwcz\" (UID: \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\") " pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.181354 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-host\") pod \"crc-debug-srwcz\" (UID: \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\") " pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 
16:05:20.181519 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-host\") pod \"crc-debug-srwcz\" (UID: \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\") " pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.181530 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72p55\" (UniqueName: \"kubernetes.io/projected/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-kube-api-access-72p55\") pod \"crc-debug-srwcz\" (UID: \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\") " pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.206937 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72p55\" (UniqueName: \"kubernetes.io/projected/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-kube-api-access-72p55\") pod \"crc-debug-srwcz\" (UID: \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\") " pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:20 crc kubenswrapper[4771]: I1001 16:05:20.335625 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:20 crc kubenswrapper[4771]: W1001 16:05:20.378320 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50b4ce05_f6ce_4f16_adf5_e6d5d18bc4ce.slice/crio-5323451a94249050255f35662d72dfa4ee2fe757ce13968b5fa604bccb48ef91 WatchSource:0}: Error finding container 5323451a94249050255f35662d72dfa4ee2fe757ce13968b5fa604bccb48ef91: Status 404 returned error can't find the container with id 5323451a94249050255f35662d72dfa4ee2fe757ce13968b5fa604bccb48ef91 Oct 01 16:05:21 crc kubenswrapper[4771]: I1001 16:05:21.018951 4771 generic.go:334] "Generic (PLEG): container finished" podID="50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce" containerID="555845d9035460515fb9ab062d04fec9ae4ae098401fd266d6af6e8d94bfbf4b" exitCode=0 Oct 01 16:05:21 crc kubenswrapper[4771]: I1001 16:05:21.019033 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-srwcz" event={"ID":"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce","Type":"ContainerDied","Data":"555845d9035460515fb9ab062d04fec9ae4ae098401fd266d6af6e8d94bfbf4b"} Oct 01 16:05:21 crc kubenswrapper[4771]: I1001 16:05:21.020521 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/crc-debug-srwcz" event={"ID":"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce","Type":"ContainerStarted","Data":"5323451a94249050255f35662d72dfa4ee2fe757ce13968b5fa604bccb48ef91"} Oct 01 16:05:21 crc kubenswrapper[4771]: I1001 16:05:21.057710 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-srwcz"] Oct 01 16:05:21 crc kubenswrapper[4771]: I1001 16:05:21.069933 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wfbm9/crc-debug-srwcz"] Oct 01 16:05:22 crc kubenswrapper[4771]: I1001 16:05:22.142791 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:22 crc kubenswrapper[4771]: I1001 16:05:22.221873 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-host\") pod \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\" (UID: \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\") " Oct 01 16:05:22 crc kubenswrapper[4771]: I1001 16:05:22.221975 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-host" (OuterVolumeSpecName: "host") pod "50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce" (UID: "50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 01 16:05:22 crc kubenswrapper[4771]: I1001 16:05:22.222053 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72p55\" (UniqueName: \"kubernetes.io/projected/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-kube-api-access-72p55\") pod \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\" (UID: \"50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce\") " Oct 01 16:05:22 crc kubenswrapper[4771]: I1001 16:05:22.222640 4771 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-host\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:22 crc kubenswrapper[4771]: I1001 16:05:22.228690 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-kube-api-access-72p55" (OuterVolumeSpecName: "kube-api-access-72p55") pod "50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce" (UID: "50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce"). InnerVolumeSpecName "kube-api-access-72p55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:05:22 crc kubenswrapper[4771]: I1001 16:05:22.324920 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72p55\" (UniqueName: \"kubernetes.io/projected/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce-kube-api-access-72p55\") on node \"crc\" DevicePath \"\"" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.037463 4771 scope.go:117] "RemoveContainer" containerID="555845d9035460515fb9ab062d04fec9ae4ae098401fd266d6af6e8d94bfbf4b" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.037710 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wfbm9/crc-debug-srwcz" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.499261 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-6vp9n_41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6/kube-rbac-proxy/0.log" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.564539 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-6vp9n_41e9ddd0-3d55-4c3d-a4e9-2fbe4a7ec6f6/manager/0.log" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.676615 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-gbllm_31b30f6f-3f4e-4c6b-9517-0eb866b2c68c/kube-rbac-proxy/0.log" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.732391 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-7tpgh_46077b26-3930-4245-86b4-2d836a165664/kube-rbac-proxy/0.log" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.778845 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-gbllm_31b30f6f-3f4e-4c6b-9517-0eb866b2c68c/manager/0.log" 
Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.896498 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-7tpgh_46077b26-3930-4245-86b4-2d836a165664/manager/0.log" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.998090 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/util/0.log" Oct 01 16:05:23 crc kubenswrapper[4771]: I1001 16:05:23.999444 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce" path="/var/lib/kubelet/pods/50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce/volumes" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.169642 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/pull/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.182009 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/util/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.202681 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/pull/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.358386 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/util/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.370021 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/pull/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.390993 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f67954f135a897979546c33b0a718c86d8b189ef1f36644161d6597b857mb4c_115476f1-e753-4ab0-9c3f-b4a6ea4a6739/extract/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.513704 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-99nfn_60253a13-4845-4234-81f4-329e6f35a86e/kube-rbac-proxy/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.606359 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-99nfn_60253a13-4845-4234-81f4-329e6f35a86e/manager/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.636926 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-55pmn_43a1a358-9eba-46eb-90c5-a34e0fad09d6/kube-rbac-proxy/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.702520 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-55pmn_43a1a358-9eba-46eb-90c5-a34e0fad09d6/manager/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.826813 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-mwl7z_681d5bbd-36b0-497f-9d27-f8cc7473399a/kube-rbac-proxy/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 16:05:24.828174 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-mwl7z_681d5bbd-36b0-497f-9d27-f8cc7473399a/manager/0.log" Oct 01 16:05:24 crc kubenswrapper[4771]: I1001 
16:05:24.979121 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-xkmp2_f7c18d5d-6ebb-4c31-a348-6ae7feebfafc/kube-rbac-proxy/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.131360 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-9d6c5db85-xkmp2_f7c18d5d-6ebb-4c31-a348-6ae7feebfafc/manager/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.139248 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-zcdfw_fcb3eff6-0c4e-4046-a829-fab3a5942d21/kube-rbac-proxy/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.183395 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5cd4858477-zcdfw_fcb3eff6-0c4e-4046-a829-fab3a5942d21/manager/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.287765 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-wps64_0c5bf036-417b-4f93-94a0-7c8ddc9028d7/kube-rbac-proxy/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.412646 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-wps64_0c5bf036-417b-4f93-94a0-7c8ddc9028d7/manager/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.498931 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-nv495_4c10ae08-be13-4725-b9be-c55ce015f33e/kube-rbac-proxy/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.518859 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-nv495_4c10ae08-be13-4725-b9be-c55ce015f33e/manager/0.log" Oct 01 
16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.597120 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-5jmlk_a1c30bd9-dd78-4d92-9423-16597bf7d758/kube-rbac-proxy/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.714614 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-5jmlk_a1c30bd9-dd78-4d92-9423-16597bf7d758/manager/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.834647 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-nv6q5_c8125ce6-7c9a-45d3-b820-698fd30d3471/kube-rbac-proxy/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.880184 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-849d5b9b84-nv6q5_c8125ce6-7c9a-45d3-b820-698fd30d3471/manager/0.log" Oct 01 16:05:25 crc kubenswrapper[4771]: I1001 16:05:25.942928 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-l85hk_863ec596-646c-41a0-b3e4-e33ad84c79aa/kube-rbac-proxy/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.101296 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-fx47m_417e6338-ae16-4903-8381-5bb1c3a92c75/kube-rbac-proxy/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.123919 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-64cd67b5cb-l85hk_863ec596-646c-41a0-b3e4-e33ad84c79aa/manager/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.136300 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7b787867f4-fx47m_417e6338-ae16-4903-8381-5bb1c3a92c75/manager/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.322826 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8c62g57_c87cb84f-c539-4562-8492-b1106b6181f1/kube-rbac-proxy/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.340151 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-77b9676b8c62g57_c87cb84f-c539-4562-8492-b1106b6181f1/manager/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.517254 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-577574bf4d-p8zrk_31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95/kube-rbac-proxy/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.556868 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-9fb4b654-b6wcr_6285b448-a922-4015-96e8-3af02ca8a82d/kube-rbac-proxy/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.771201 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-hl4lz_4a21d3b6-e1e3-493b-baf1-fcbb055fb859/registry-server/0.log" Oct 01 16:05:26 crc kubenswrapper[4771]: I1001 16:05:26.803710 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-9fb4b654-b6wcr_6285b448-a922-4015-96e8-3af02ca8a82d/operator/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 16:05:27.174347 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-qnlxq_b90ad79b-e447-4a96-82a1-4ae8cb5b9959/kube-rbac-proxy/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 
16:05:27.291581 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-qnlxq_b90ad79b-e447-4a96-82a1-4ae8cb5b9959/manager/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 16:05:27.302114 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-bkpfg_9c1c4098-4bbf-4d54-a09d-44b29ef352c3/kube-rbac-proxy/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 16:05:27.511305 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-bkpfg_9c1c4098-4bbf-4d54-a09d-44b29ef352c3/manager/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 16:05:27.597334 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-7clp8_b5efb91f-7e66-488b-ab6f-e52dbf63bc3c/operator/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 16:05:27.705497 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-wljgn_193791c4-4d63-4f50-a743-439b664c16b7/kube-rbac-proxy/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 16:05:27.761549 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-84d6b4b759-wljgn_193791c4-4d63-4f50-a743-439b664c16b7/manager/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 16:05:27.890707 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-c495dbccb-25dzd_9abd1d50-adca-4d9a-8c33-89c3242174a5/kube-rbac-proxy/0.log" Oct 01 16:05:27 crc kubenswrapper[4771]: I1001 16:05:27.942119 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-577574bf4d-p8zrk_31e6f7a8-0c8e-48c9-a2ba-38ffdabc1d95/manager/0.log" Oct 01 
16:05:28 crc kubenswrapper[4771]: I1001 16:05:28.022929 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-c495dbccb-25dzd_9abd1d50-adca-4d9a-8c33-89c3242174a5/manager/0.log" Oct 01 16:05:28 crc kubenswrapper[4771]: I1001 16:05:28.055200 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-vvkwz_e6d65172-53ec-4aae-a508-b955072cdd2a/kube-rbac-proxy/0.log" Oct 01 16:05:28 crc kubenswrapper[4771]: I1001 16:05:28.112340 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-85777745bb-vvkwz_e6d65172-53ec-4aae-a508-b955072cdd2a/manager/0.log" Oct 01 16:05:28 crc kubenswrapper[4771]: I1001 16:05:28.230752 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-crn74_a0517b85-5f5f-4d87-92f8-901564af068c/kube-rbac-proxy/0.log" Oct 01 16:05:28 crc kubenswrapper[4771]: I1001 16:05:28.244285 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6b9957f54f-crn74_a0517b85-5f5f-4d87-92f8-901564af068c/manager/0.log" Oct 01 16:05:44 crc kubenswrapper[4771]: I1001 16:05:44.340247 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8gml8_6730c7b8-fcbf-48c4-b2b8-5ed3566a7cd4/control-plane-machine-set-operator/0.log" Oct 01 16:05:44 crc kubenswrapper[4771]: I1001 16:05:44.488589 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qlkrl_f160df7c-e97b-4c5a-badf-08379f8e27bf/kube-rbac-proxy/0.log" Oct 01 16:05:44 crc kubenswrapper[4771]: I1001 16:05:44.568257 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-qlkrl_f160df7c-e97b-4c5a-badf-08379f8e27bf/machine-api-operator/0.log" Oct 01 16:05:56 crc kubenswrapper[4771]: I1001 16:05:56.673606 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-n6g7s_96d1876b-3e09-4899-8b04-a49c88ebf65d/cert-manager-controller/0.log" Oct 01 16:05:56 crc kubenswrapper[4771]: I1001 16:05:56.904758 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-wjfnq_153105d4-1f8e-43f2-bcea-0f3a36598eb0/cert-manager-cainjector/0.log" Oct 01 16:05:56 crc kubenswrapper[4771]: I1001 16:05:56.944239 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-9zpvt_fdd5e5a8-3303-4cee-ad76-d47d1a0da067/cert-manager-webhook/0.log" Oct 01 16:06:08 crc kubenswrapper[4771]: I1001 16:06:08.642714 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-nghbt_2d196fd3-9a93-4f81-b0ad-fefca77240a5/nmstate-console-plugin/0.log" Oct 01 16:06:08 crc kubenswrapper[4771]: I1001 16:06:08.814703 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-vqfvd_a8a82139-4b56-419e-a4e4-143e3246ec96/nmstate-handler/0.log" Oct 01 16:06:08 crc kubenswrapper[4771]: I1001 16:06:08.843499 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-wpddr_5082e8a7-dbba-4c99-9c6a-f35f64310963/kube-rbac-proxy/0.log" Oct 01 16:06:08 crc kubenswrapper[4771]: I1001 16:06:08.855455 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-wpddr_5082e8a7-dbba-4c99-9c6a-f35f64310963/nmstate-metrics/0.log" Oct 01 16:06:09 crc kubenswrapper[4771]: I1001 16:06:09.022135 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-9twbv_88f55c4c-a1d9-4751-9776-562464717201/nmstate-operator/0.log" Oct 01 16:06:09 crc kubenswrapper[4771]: I1001 16:06:09.089451 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-rx6d2_e9d87ba9-d0d9-4647-a69b-4a114140b6be/nmstate-webhook/0.log" Oct 01 16:06:12 crc kubenswrapper[4771]: I1001 16:06:12.177598 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:06:12 crc kubenswrapper[4771]: I1001 16:06:12.177966 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:06:23 crc kubenswrapper[4771]: I1001 16:06:23.217119 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ffb8z_e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4/kube-rbac-proxy/0.log" Oct 01 16:06:23 crc kubenswrapper[4771]: I1001 16:06:23.343892 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-ffb8z_e2cac0e6-5d1c-4914-b9f6-334aeedcf2d4/controller/0.log" Oct 01 16:06:23 crc kubenswrapper[4771]: I1001 16:06:23.845569 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-frr-files/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.029611 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-frr-files/0.log" 
Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.064959 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-reloader/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.095627 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-reloader/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.100106 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-metrics/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.275708 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-metrics/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.284903 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-frr-files/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.333270 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-reloader/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.369383 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-metrics/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.516353 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-reloader/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.524354 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-frr-files/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.534718 4771 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/cp-metrics/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.600154 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/controller/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.717063 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/frr-metrics/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.760502 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/kube-rbac-proxy/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.816268 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/kube-rbac-proxy-frr/0.log" Oct 01 16:06:24 crc kubenswrapper[4771]: I1001 16:06:24.947692 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/reloader/0.log" Oct 01 16:06:25 crc kubenswrapper[4771]: I1001 16:06:25.085422 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-4lvs5_d0200ff4-2245-408d-bf5f-28479e049c57/frr-k8s-webhook-server/0.log" Oct 01 16:06:25 crc kubenswrapper[4771]: I1001 16:06:25.292210 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-554dcf567c-bnmzr_35c6bc93-608d-4534-9ccd-493ea57f189d/manager/0.log" Oct 01 16:06:25 crc kubenswrapper[4771]: I1001 16:06:25.492002 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-bd9669845-gnppv_b09aca49-c227-4561-9f87-df661ec6d85c/webhook-server/0.log" Oct 01 16:06:25 crc kubenswrapper[4771]: I1001 16:06:25.623606 
4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s5glz_cf712988-695e-4dae-a121-ce52bf39689e/kube-rbac-proxy/0.log" Oct 01 16:06:26 crc kubenswrapper[4771]: I1001 16:06:26.012180 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-mp8gm_36cbe93e-7162-4367-a756-e731b356fa91/frr/0.log" Oct 01 16:06:26 crc kubenswrapper[4771]: I1001 16:06:26.132715 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-s5glz_cf712988-695e-4dae-a121-ce52bf39689e/speaker/0.log" Oct 01 16:06:37 crc kubenswrapper[4771]: I1001 16:06:37.457865 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/util/0.log" Oct 01 16:06:37 crc kubenswrapper[4771]: I1001 16:06:37.656454 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/util/0.log" Oct 01 16:06:37 crc kubenswrapper[4771]: I1001 16:06:37.664422 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/pull/0.log" Oct 01 16:06:37 crc kubenswrapper[4771]: I1001 16:06:37.668976 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/pull/0.log" Oct 01 16:06:37 crc kubenswrapper[4771]: I1001 16:06:37.845233 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/extract/0.log" Oct 01 16:06:37 crc kubenswrapper[4771]: I1001 16:06:37.853032 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/pull/0.log" Oct 01 16:06:37 crc kubenswrapper[4771]: I1001 16:06:37.880165 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bcbbx8p_f9a434b3-f6c6-441a-bc5f-0731967288da/util/0.log" Oct 01 16:06:38 crc kubenswrapper[4771]: I1001 16:06:38.036460 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-utilities/0.log" Oct 01 16:06:38 crc kubenswrapper[4771]: I1001 16:06:38.223938 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-content/0.log" Oct 01 16:06:38 crc kubenswrapper[4771]: I1001 16:06:38.226659 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-utilities/0.log" Oct 01 16:06:38 crc kubenswrapper[4771]: I1001 16:06:38.243717 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-content/0.log" Oct 01 16:06:38 crc kubenswrapper[4771]: I1001 16:06:38.425607 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-utilities/0.log" Oct 01 16:06:38 crc kubenswrapper[4771]: I1001 16:06:38.432852 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/extract-content/0.log" Oct 01 16:06:38 crc kubenswrapper[4771]: I1001 16:06:38.659992 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-utilities/0.log" Oct 01 16:06:38 crc kubenswrapper[4771]: I1001 16:06:38.803828 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cxlz6_e0ead9f7-cc56-4d45-9718-d175502b54df/registry-server/0.log" Oct 01 16:06:39 crc kubenswrapper[4771]: I1001 16:06:39.389120 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-content/0.log" Oct 01 16:06:39 crc kubenswrapper[4771]: I1001 16:06:39.418853 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-utilities/0.log" Oct 01 16:06:39 crc kubenswrapper[4771]: I1001 16:06:39.447518 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-content/0.log" Oct 01 16:06:39 crc kubenswrapper[4771]: I1001 16:06:39.615596 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-content/0.log" Oct 01 16:06:39 crc kubenswrapper[4771]: I1001 16:06:39.711597 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/extract-utilities/0.log" Oct 01 16:06:39 crc kubenswrapper[4771]: I1001 16:06:39.851109 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/util/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: I1001 16:06:40.038927 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/pull/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: I1001 16:06:40.039373 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/util/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: I1001 16:06:40.096455 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/pull/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: I1001 16:06:40.155172 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-bv7vl_2061f39f-1b36-4d01-b13f-33156f106012/registry-server/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: I1001 16:06:40.303095 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/pull/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: I1001 16:06:40.337115 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/util/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: I1001 16:06:40.348775 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96g4kvv_5c4805aa-64a3-4354-b3ab-ab48935503cf/extract/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: I1001 16:06:40.483652 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4bz2c_bf346cda-7a16-42a2-b731-d8834b7a1380/marketplace-operator/0.log" Oct 01 16:06:40 crc kubenswrapper[4771]: 
I1001 16:06:40.534111 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-utilities/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.256344 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-content/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.262213 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-content/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.270640 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-utilities/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.417673 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-content/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.449411 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/extract-utilities/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.494829 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-utilities/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.677001 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-utilities/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.677850 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-content/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.706191 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-content/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.841254 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-content/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.857182 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9ftld_e61495fc-ef85-44de-8135-f080a089e4ed/registry-server/0.log" Oct 01 16:06:41 crc kubenswrapper[4771]: I1001 16:06:41.899951 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/extract-utilities/0.log" Oct 01 16:06:42 crc kubenswrapper[4771]: I1001 16:06:42.176830 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:06:42 crc kubenswrapper[4771]: I1001 16:06:42.177178 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:06:42 crc kubenswrapper[4771]: I1001 16:06:42.414795 4771 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-lzbhb_e8c77405-b06a-457f-ae26-aa105e1e638c/registry-server/0.log" Oct 01 16:07:12 crc kubenswrapper[4771]: I1001 16:07:12.176793 4771 patch_prober.go:28] interesting pod/machine-config-daemon-vck47 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 01 16:07:12 crc kubenswrapper[4771]: I1001 16:07:12.177271 4771 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 01 16:07:12 crc kubenswrapper[4771]: I1001 16:07:12.177313 4771 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-vck47" Oct 01 16:07:12 crc kubenswrapper[4771]: I1001 16:07:12.178004 4771 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"} pod="openshift-machine-config-operator/machine-config-daemon-vck47" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 01 16:07:12 crc kubenswrapper[4771]: I1001 16:07:12.178059 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerName="machine-config-daemon" containerID="cri-o://f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" gracePeriod=600 Oct 01 16:07:12 crc kubenswrapper[4771]: E1001 16:07:12.300681 4771 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:07:13 crc kubenswrapper[4771]: I1001 16:07:13.037514 4771 generic.go:334] "Generic (PLEG): container finished" podID="289ee6d3-fabe-417f-964c-76ca03c143cc" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" exitCode=0 Oct 01 16:07:13 crc kubenswrapper[4771]: I1001 16:07:13.037623 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-vck47" event={"ID":"289ee6d3-fabe-417f-964c-76ca03c143cc","Type":"ContainerDied","Data":"f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"} Oct 01 16:07:13 crc kubenswrapper[4771]: I1001 16:07:13.038059 4771 scope.go:117] "RemoveContainer" containerID="19c8aefbbf261f233703d5d75b1ff28f73820b1b39703c62ab135fd2a27acfcb" Oct 01 16:07:13 crc kubenswrapper[4771]: I1001 16:07:13.039276 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:07:13 crc kubenswrapper[4771]: E1001 16:07:13.039878 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:07:24 crc kubenswrapper[4771]: I1001 16:07:24.985454 4771 scope.go:117] "RemoveContainer" 
containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:07:24 crc kubenswrapper[4771]: E1001 16:07:24.986626 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:07:36 crc kubenswrapper[4771]: I1001 16:07:36.986488 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:07:36 crc kubenswrapper[4771]: E1001 16:07:36.987300 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:07:47 crc kubenswrapper[4771]: I1001 16:07:47.985254 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:07:47 crc kubenswrapper[4771]: E1001 16:07:47.987286 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:08:01 crc kubenswrapper[4771]: I1001 16:08:01.985031 4771 scope.go:117] 
"RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:08:01 crc kubenswrapper[4771]: E1001 16:08:01.985881 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:08:04 crc kubenswrapper[4771]: I1001 16:08:04.653038 4771 scope.go:117] "RemoveContainer" containerID="267e3b96490422a4f114804b74a91c0b475b3c54ae7b54bc305f8b506ea6dc75" Oct 01 16:08:13 crc kubenswrapper[4771]: I1001 16:08:13.985257 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:08:13 crc kubenswrapper[4771]: E1001 16:08:13.987340 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:08:26 crc kubenswrapper[4771]: I1001 16:08:26.985918 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:08:26 crc kubenswrapper[4771]: E1001 16:08:26.986540 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:08:36 crc kubenswrapper[4771]: I1001 16:08:36.918358 4771 generic.go:334] "Generic (PLEG): container finished" podID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerID="ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70" exitCode=0 Oct 01 16:08:36 crc kubenswrapper[4771]: I1001 16:08:36.918405 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wfbm9/must-gather-brgcz" event={"ID":"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b","Type":"ContainerDied","Data":"ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70"} Oct 01 16:08:36 crc kubenswrapper[4771]: I1001 16:08:36.919885 4771 scope.go:117] "RemoveContainer" containerID="ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70" Oct 01 16:08:37 crc kubenswrapper[4771]: I1001 16:08:37.519011 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wfbm9_must-gather-brgcz_bfd8d29d-231e-4d89-b6a1-c65c00b76a7b/gather/0.log" Oct 01 16:08:39 crc kubenswrapper[4771]: I1001 16:08:39.985188 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:08:39 crc kubenswrapper[4771]: E1001 16:08:39.985696 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.002428 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l6sjb"] Oct 01 16:08:48 crc kubenswrapper[4771]: E1001 16:08:48.003748 
4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce" containerName="container-00" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.003767 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce" containerName="container-00" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.004013 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="50b4ce05-f6ce-4f16-adf5-e6d5d18bc4ce" containerName="container-00" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.005801 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6sjb"] Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.005964 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.089952 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-utilities\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.090032 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bch69\" (UniqueName: \"kubernetes.io/projected/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-kube-api-access-bch69\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.090103 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-catalog-content\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.192333 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-utilities\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.192428 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bch69\" (UniqueName: \"kubernetes.io/projected/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-kube-api-access-bch69\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.192506 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-catalog-content\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.192948 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-utilities\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.193155 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-catalog-content\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.574689 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bch69\" (UniqueName: \"kubernetes.io/projected/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-kube-api-access-bch69\") pod \"certified-operators-l6sjb\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.634659 4771 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.973675 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-wfbm9/must-gather-brgcz"] Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.974313 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-wfbm9/must-gather-brgcz" podUID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerName="copy" containerID="cri-o://ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a" gracePeriod=2 Oct 01 16:08:48 crc kubenswrapper[4771]: I1001 16:08:48.983827 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-wfbm9/must-gather-brgcz"] Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.122540 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l6sjb"] Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.487755 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wfbm9_must-gather-brgcz_bfd8d29d-231e-4d89-b6a1-c65c00b76a7b/copy/0.log" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.488478 4771 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.621779 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-must-gather-output\") pod \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\" (UID: \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\") " Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.622006 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glzhm\" (UniqueName: \"kubernetes.io/projected/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-kube-api-access-glzhm\") pod \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\" (UID: \"bfd8d29d-231e-4d89-b6a1-c65c00b76a7b\") " Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.632535 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-kube-api-access-glzhm" (OuterVolumeSpecName: "kube-api-access-glzhm") pod "bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" (UID: "bfd8d29d-231e-4d89-b6a1-c65c00b76a7b"). InnerVolumeSpecName "kube-api-access-glzhm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.725024 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glzhm\" (UniqueName: \"kubernetes.io/projected/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-kube-api-access-glzhm\") on node \"crc\" DevicePath \"\"" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.782986 4771 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6fw46"] Oct 01 16:08:49 crc kubenswrapper[4771]: E1001 16:08:49.783456 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerName="copy" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.783475 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerName="copy" Oct 01 16:08:49 crc kubenswrapper[4771]: E1001 16:08:49.783492 4771 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerName="gather" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.783499 4771 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerName="gather" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.788893 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerName="copy" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.788949 4771 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerName="gather" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.790807 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.796568 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6fw46"] Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.798646 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" (UID: "bfd8d29d-231e-4d89-b6a1-c65c00b76a7b"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.829512 4771 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.931356 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgj48\" (UniqueName: \"kubernetes.io/projected/f2eb213a-3466-4559-8c22-59ea7300784c-kube-api-access-fgj48\") pod \"community-operators-6fw46\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.931458 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-catalog-content\") pod \"community-operators-6fw46\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.931594 4771 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-utilities\") pod \"community-operators-6fw46\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:49 crc kubenswrapper[4771]: I1001 16:08:49.998228 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" path="/var/lib/kubelet/pods/bfd8d29d-231e-4d89-b6a1-c65c00b76a7b/volumes" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.033579 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgj48\" (UniqueName: \"kubernetes.io/projected/f2eb213a-3466-4559-8c22-59ea7300784c-kube-api-access-fgj48\") pod \"community-operators-6fw46\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.033650 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-catalog-content\") pod \"community-operators-6fw46\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.033759 4771 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-utilities\") pod \"community-operators-6fw46\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.034219 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-utilities\") pod \"community-operators-6fw46\" (UID: 
\"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.034374 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-catalog-content\") pod \"community-operators-6fw46\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.052893 4771 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgj48\" (UniqueName: \"kubernetes.io/projected/f2eb213a-3466-4559-8c22-59ea7300784c-kube-api-access-fgj48\") pod \"community-operators-6fw46\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.074670 4771 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-wfbm9_must-gather-brgcz_bfd8d29d-231e-4d89-b6a1-c65c00b76a7b/copy/0.log" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.075316 4771 generic.go:334] "Generic (PLEG): container finished" podID="bfd8d29d-231e-4d89-b6a1-c65c00b76a7b" containerID="ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a" exitCode=143 Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.075387 4771 scope.go:117] "RemoveContainer" containerID="ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.075408 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wfbm9/must-gather-brgcz" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.077418 4771 generic.go:334] "Generic (PLEG): container finished" podID="854dfb65-df55-4285-8e9e-6ab6e0ee5fc4" containerID="6d0a7299f35285e5864442fb62d9834f68c7f39113cf59ffa4147671d1a18ac7" exitCode=0 Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.077457 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6sjb" event={"ID":"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4","Type":"ContainerDied","Data":"6d0a7299f35285e5864442fb62d9834f68c7f39113cf59ffa4147671d1a18ac7"} Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.077490 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6sjb" event={"ID":"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4","Type":"ContainerStarted","Data":"08fb7012ea372101d7e477429bfb65797446338ba1c599b76c9a0491da37ae17"} Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.079568 4771 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.114883 4771 scope.go:117] "RemoveContainer" containerID="ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.117075 4771 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.201747 4771 scope.go:117] "RemoveContainer" containerID="ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a" Oct 01 16:08:50 crc kubenswrapper[4771]: E1001 16:08:50.202994 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a\": container with ID starting with ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a not found: ID does not exist" containerID="ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.203042 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a"} err="failed to get container status \"ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a\": rpc error: code = NotFound desc = could not find container \"ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a\": container with ID starting with ed3fdce484e6b258cfcdd683343311cae2bba8d648ed04bf787223793586461a not found: ID does not exist" Oct 01 16:08:50 crc kubenswrapper[4771]: I1001 16:08:50.203073 4771 scope.go:117] "RemoveContainer" containerID="ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70" Oct 01 16:08:50 crc kubenswrapper[4771]: E1001 16:08:50.203397 4771 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70\": container with ID starting with ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70 not found: ID does not exist" containerID="ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70" Oct 01 16:08:50 crc 
kubenswrapper[4771]: I1001 16:08:50.203430 4771 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70"} err="failed to get container status \"ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70\": rpc error: code = NotFound desc = could not find container \"ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70\": container with ID starting with ff5572627a2645aae6e23c6c5e1b619d9763c95e556fb0f4474b25ee61711a70 not found: ID does not exist" Oct 01 16:08:51 crc kubenswrapper[4771]: I1001 16:08:51.088436 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6sjb" event={"ID":"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4","Type":"ContainerStarted","Data":"084909ac9d2220553f3aaaf2e4e25f812647515d2521ff6c02e86d126d469119"} Oct 01 16:08:51 crc kubenswrapper[4771]: I1001 16:08:51.153850 4771 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6fw46"] Oct 01 16:08:51 crc kubenswrapper[4771]: W1001 16:08:51.160180 4771 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2eb213a_3466_4559_8c22_59ea7300784c.slice/crio-88706417aae09d3b3bd86bd6951fe0b1aecda3fdf538241243e6163e1aad6347 WatchSource:0}: Error finding container 88706417aae09d3b3bd86bd6951fe0b1aecda3fdf538241243e6163e1aad6347: Status 404 returned error can't find the container with id 88706417aae09d3b3bd86bd6951fe0b1aecda3fdf538241243e6163e1aad6347 Oct 01 16:08:52 crc kubenswrapper[4771]: I1001 16:08:52.101779 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2eb213a-3466-4559-8c22-59ea7300784c" containerID="db727645b267719bbcf9dcf5197a3f5ab0261c22f4c339af4071da51eddf8bdc" exitCode=0 Oct 01 16:08:52 crc kubenswrapper[4771]: I1001 16:08:52.102061 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-6fw46" event={"ID":"f2eb213a-3466-4559-8c22-59ea7300784c","Type":"ContainerDied","Data":"db727645b267719bbcf9dcf5197a3f5ab0261c22f4c339af4071da51eddf8bdc"} Oct 01 16:08:52 crc kubenswrapper[4771]: I1001 16:08:52.102109 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fw46" event={"ID":"f2eb213a-3466-4559-8c22-59ea7300784c","Type":"ContainerStarted","Data":"88706417aae09d3b3bd86bd6951fe0b1aecda3fdf538241243e6163e1aad6347"} Oct 01 16:08:52 crc kubenswrapper[4771]: I1001 16:08:52.105478 4771 generic.go:334] "Generic (PLEG): container finished" podID="854dfb65-df55-4285-8e9e-6ab6e0ee5fc4" containerID="084909ac9d2220553f3aaaf2e4e25f812647515d2521ff6c02e86d126d469119" exitCode=0 Oct 01 16:08:52 crc kubenswrapper[4771]: I1001 16:08:52.105534 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6sjb" event={"ID":"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4","Type":"ContainerDied","Data":"084909ac9d2220553f3aaaf2e4e25f812647515d2521ff6c02e86d126d469119"} Oct 01 16:08:53 crc kubenswrapper[4771]: I1001 16:08:53.118012 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6sjb" event={"ID":"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4","Type":"ContainerStarted","Data":"940143a44772557c983651c47afb4ee1aa2fe58505dbbb6087968ef72385e021"} Oct 01 16:08:53 crc kubenswrapper[4771]: I1001 16:08:53.120715 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fw46" event={"ID":"f2eb213a-3466-4559-8c22-59ea7300784c","Type":"ContainerStarted","Data":"eea17be540970b337693131c5727bb3e7684a27cab489bebb026e065500adb27"} Oct 01 16:08:53 crc kubenswrapper[4771]: I1001 16:08:53.149883 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l6sjb" podStartSLOduration=3.701897944 
podStartE2EDuration="6.149866536s" podCreationTimestamp="2025-10-01 16:08:47 +0000 UTC" firstStartedPulling="2025-10-01 16:08:50.079306769 +0000 UTC m=+4374.698481940" lastFinishedPulling="2025-10-01 16:08:52.527275361 +0000 UTC m=+4377.146450532" observedRunningTime="2025-10-01 16:08:53.143004567 +0000 UTC m=+4377.762179758" watchObservedRunningTime="2025-10-01 16:08:53.149866536 +0000 UTC m=+4377.769041707" Oct 01 16:08:53 crc kubenswrapper[4771]: I1001 16:08:53.985527 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9" Oct 01 16:08:53 crc kubenswrapper[4771]: E1001 16:08:53.985854 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc" Oct 01 16:08:54 crc kubenswrapper[4771]: I1001 16:08:54.136502 4771 generic.go:334] "Generic (PLEG): container finished" podID="f2eb213a-3466-4559-8c22-59ea7300784c" containerID="eea17be540970b337693131c5727bb3e7684a27cab489bebb026e065500adb27" exitCode=0 Oct 01 16:08:54 crc kubenswrapper[4771]: I1001 16:08:54.136607 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fw46" event={"ID":"f2eb213a-3466-4559-8c22-59ea7300784c","Type":"ContainerDied","Data":"eea17be540970b337693131c5727bb3e7684a27cab489bebb026e065500adb27"} Oct 01 16:08:55 crc kubenswrapper[4771]: I1001 16:08:55.148441 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fw46" event={"ID":"f2eb213a-3466-4559-8c22-59ea7300784c","Type":"ContainerStarted","Data":"a89da8b4ce736dff9c3e31514b36a8c2a9da38e14bb79a5863c56a0e4395180b"} Oct 01 
16:08:55 crc kubenswrapper[4771]: I1001 16:08:55.166370 4771 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6fw46" podStartSLOduration=3.618568296 podStartE2EDuration="6.166354602s" podCreationTimestamp="2025-10-01 16:08:49 +0000 UTC" firstStartedPulling="2025-10-01 16:08:52.103607519 +0000 UTC m=+4376.722782700" lastFinishedPulling="2025-10-01 16:08:54.651393815 +0000 UTC m=+4379.270569006" observedRunningTime="2025-10-01 16:08:55.165783107 +0000 UTC m=+4379.784958288" watchObservedRunningTime="2025-10-01 16:08:55.166354602 +0000 UTC m=+4379.785529763" Oct 01 16:08:58 crc kubenswrapper[4771]: I1001 16:08:58.635530 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:58 crc kubenswrapper[4771]: I1001 16:08:58.636110 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:58 crc kubenswrapper[4771]: I1001 16:08:58.705215 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:59 crc kubenswrapper[4771]: I1001 16:08:59.263253 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:08:59 crc kubenswrapper[4771]: I1001 16:08:59.573758 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6sjb"] Oct 01 16:09:00 crc kubenswrapper[4771]: I1001 16:09:00.117874 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:09:00 crc kubenswrapper[4771]: I1001 16:09:00.119321 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:09:00 crc kubenswrapper[4771]: I1001 
16:09:00.426039 4771 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:09:01 crc kubenswrapper[4771]: I1001 16:09:01.209350 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l6sjb" podUID="854dfb65-df55-4285-8e9e-6ab6e0ee5fc4" containerName="registry-server" containerID="cri-o://940143a44772557c983651c47afb4ee1aa2fe58505dbbb6087968ef72385e021" gracePeriod=2 Oct 01 16:09:01 crc kubenswrapper[4771]: I1001 16:09:01.266649 4771 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:09:01 crc kubenswrapper[4771]: I1001 16:09:01.981235 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6fw46"] Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.218019 4771 generic.go:334] "Generic (PLEG): container finished" podID="854dfb65-df55-4285-8e9e-6ab6e0ee5fc4" containerID="940143a44772557c983651c47afb4ee1aa2fe58505dbbb6087968ef72385e021" exitCode=0 Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.218107 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6sjb" event={"ID":"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4","Type":"ContainerDied","Data":"940143a44772557c983651c47afb4ee1aa2fe58505dbbb6087968ef72385e021"} Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.352083 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.508816 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-utilities\") pod \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.508933 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bch69\" (UniqueName: \"kubernetes.io/projected/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-kube-api-access-bch69\") pod \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.509026 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-catalog-content\") pod \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\" (UID: \"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4\") " Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.510836 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-utilities" (OuterVolumeSpecName: "utilities") pod "854dfb65-df55-4285-8e9e-6ab6e0ee5fc4" (UID: "854dfb65-df55-4285-8e9e-6ab6e0ee5fc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.516920 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-kube-api-access-bch69" (OuterVolumeSpecName: "kube-api-access-bch69") pod "854dfb65-df55-4285-8e9e-6ab6e0ee5fc4" (UID: "854dfb65-df55-4285-8e9e-6ab6e0ee5fc4"). InnerVolumeSpecName "kube-api-access-bch69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.559456 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "854dfb65-df55-4285-8e9e-6ab6e0ee5fc4" (UID: "854dfb65-df55-4285-8e9e-6ab6e0ee5fc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.611095 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-utilities\") on node \"crc\" DevicePath \"\"" Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.611134 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bch69\" (UniqueName: \"kubernetes.io/projected/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-kube-api-access-bch69\") on node \"crc\" DevicePath \"\"" Oct 01 16:09:02 crc kubenswrapper[4771]: I1001 16:09:02.611146 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.233460 4771 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l6sjb" Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.233912 4771 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6fw46" podUID="f2eb213a-3466-4559-8c22-59ea7300784c" containerName="registry-server" containerID="cri-o://a89da8b4ce736dff9c3e31514b36a8c2a9da38e14bb79a5863c56a0e4395180b" gracePeriod=2 Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.233469 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l6sjb" event={"ID":"854dfb65-df55-4285-8e9e-6ab6e0ee5fc4","Type":"ContainerDied","Data":"08fb7012ea372101d7e477429bfb65797446338ba1c599b76c9a0491da37ae17"} Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.234087 4771 scope.go:117] "RemoveContainer" containerID="940143a44772557c983651c47afb4ee1aa2fe58505dbbb6087968ef72385e021" Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.270918 4771 scope.go:117] "RemoveContainer" containerID="084909ac9d2220553f3aaaf2e4e25f812647515d2521ff6c02e86d126d469119" Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.274891 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l6sjb"] Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.282647 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l6sjb"] Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.299685 4771 scope.go:117] "RemoveContainer" containerID="6d0a7299f35285e5864442fb62d9834f68c7f39113cf59ffa4147671d1a18ac7" Oct 01 16:09:03 crc kubenswrapper[4771]: I1001 16:09:03.999012 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854dfb65-df55-4285-8e9e-6ab6e0ee5fc4" path="/var/lib/kubelet/pods/854dfb65-df55-4285-8e9e-6ab6e0ee5fc4/volumes" Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.247171 4771 generic.go:334] "Generic 
(PLEG): container finished" podID="f2eb213a-3466-4559-8c22-59ea7300784c" containerID="a89da8b4ce736dff9c3e31514b36a8c2a9da38e14bb79a5863c56a0e4395180b" exitCode=0 Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.247277 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fw46" event={"ID":"f2eb213a-3466-4559-8c22-59ea7300784c","Type":"ContainerDied","Data":"a89da8b4ce736dff9c3e31514b36a8c2a9da38e14bb79a5863c56a0e4395180b"} Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.573985 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6fw46" Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.694586 4771 scope.go:117] "RemoveContainer" containerID="e97692021fe42f3d283229f4f257b89d8af7dc2cb83a1845bedfe8a160366360" Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.716593 4771 scope.go:117] "RemoveContainer" containerID="f047eef171c386a3e4f10d9e29496a0547938fa486738a6f52e11764dcde0a0a" Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.752213 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgj48\" (UniqueName: \"kubernetes.io/projected/f2eb213a-3466-4559-8c22-59ea7300784c-kube-api-access-fgj48\") pod \"f2eb213a-3466-4559-8c22-59ea7300784c\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.752337 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-utilities\") pod \"f2eb213a-3466-4559-8c22-59ea7300784c\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.752395 4771 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-catalog-content\") pod \"f2eb213a-3466-4559-8c22-59ea7300784c\" (UID: \"f2eb213a-3466-4559-8c22-59ea7300784c\") " Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.753169 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-utilities" (OuterVolumeSpecName: "utilities") pod "f2eb213a-3466-4559-8c22-59ea7300784c" (UID: "f2eb213a-3466-4559-8c22-59ea7300784c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.757605 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2eb213a-3466-4559-8c22-59ea7300784c-kube-api-access-fgj48" (OuterVolumeSpecName: "kube-api-access-fgj48") pod "f2eb213a-3466-4559-8c22-59ea7300784c" (UID: "f2eb213a-3466-4559-8c22-59ea7300784c"). InnerVolumeSpecName "kube-api-access-fgj48". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.812690 4771 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2eb213a-3466-4559-8c22-59ea7300784c" (UID: "f2eb213a-3466-4559-8c22-59ea7300784c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.854323 4771 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgj48\" (UniqueName: \"kubernetes.io/projected/f2eb213a-3466-4559-8c22-59ea7300784c-kube-api-access-fgj48\") on node \"crc\" DevicePath \"\""
Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.854359 4771 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-utilities\") on node \"crc\" DevicePath \"\""
Oct 01 16:09:04 crc kubenswrapper[4771]: I1001 16:09:04.854368 4771 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2eb213a-3466-4559-8c22-59ea7300784c-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 01 16:09:05 crc kubenswrapper[4771]: I1001 16:09:05.266054 4771 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6fw46" event={"ID":"f2eb213a-3466-4559-8c22-59ea7300784c","Type":"ContainerDied","Data":"88706417aae09d3b3bd86bd6951fe0b1aecda3fdf538241243e6163e1aad6347"}
Oct 01 16:09:05 crc kubenswrapper[4771]: I1001 16:09:05.266196 4771 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6fw46"
Oct 01 16:09:05 crc kubenswrapper[4771]: I1001 16:09:05.266517 4771 scope.go:117] "RemoveContainer" containerID="a89da8b4ce736dff9c3e31514b36a8c2a9da38e14bb79a5863c56a0e4395180b"
Oct 01 16:09:05 crc kubenswrapper[4771]: I1001 16:09:05.311400 4771 scope.go:117] "RemoveContainer" containerID="eea17be540970b337693131c5727bb3e7684a27cab489bebb026e065500adb27"
Oct 01 16:09:05 crc kubenswrapper[4771]: I1001 16:09:05.316964 4771 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6fw46"]
Oct 01 16:09:05 crc kubenswrapper[4771]: I1001 16:09:05.327690 4771 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6fw46"]
Oct 01 16:09:05 crc kubenswrapper[4771]: I1001 16:09:05.331073 4771 scope.go:117] "RemoveContainer" containerID="db727645b267719bbcf9dcf5197a3f5ab0261c22f4c339af4071da51eddf8bdc"
Oct 01 16:09:05 crc kubenswrapper[4771]: I1001 16:09:05.998821 4771 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2eb213a-3466-4559-8c22-59ea7300784c" path="/var/lib/kubelet/pods/f2eb213a-3466-4559-8c22-59ea7300784c/volumes"
Oct 01 16:09:06 crc kubenswrapper[4771]: I1001 16:09:06.985669 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:09:06 crc kubenswrapper[4771]: E1001 16:09:06.986266 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 16:09:17 crc kubenswrapper[4771]: I1001 16:09:17.984954 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:09:17 crc kubenswrapper[4771]: E1001 16:09:17.985747 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 16:09:29 crc kubenswrapper[4771]: I1001 16:09:29.985424 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:09:29 crc kubenswrapper[4771]: E1001 16:09:29.986187 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 16:09:43 crc kubenswrapper[4771]: I1001 16:09:43.986245 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:09:43 crc kubenswrapper[4771]: E1001 16:09:43.987555 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 16:09:58 crc kubenswrapper[4771]: I1001 16:09:58.984788 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:09:58 crc kubenswrapper[4771]: E1001 16:09:58.985675 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 16:10:04 crc kubenswrapper[4771]: I1001 16:10:04.824043 4771 scope.go:117] "RemoveContainer" containerID="b6821859df514ce75e3d006b239ab2ceae671a4cf290c33c66d5a03d04c2ec28"
Oct 01 16:10:11 crc kubenswrapper[4771]: I1001 16:10:11.986520 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:10:11 crc kubenswrapper[4771]: E1001 16:10:11.987219 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 16:10:25 crc kubenswrapper[4771]: I1001 16:10:25.992127 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:10:25 crc kubenswrapper[4771]: E1001 16:10:25.992839 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 16:10:39 crc kubenswrapper[4771]: I1001 16:10:39.985488 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:10:39 crc kubenswrapper[4771]: E1001 16:10:39.986904 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"
Oct 01 16:10:53 crc kubenswrapper[4771]: I1001 16:10:53.985406 4771 scope.go:117] "RemoveContainer" containerID="f55cdf44553a8313053f9f313ebda5423df128ee40bc2ac50a7599b6ed530df9"
Oct 01 16:10:53 crc kubenswrapper[4771]: E1001 16:10:53.986227 4771 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-vck47_openshift-machine-config-operator(289ee6d3-fabe-417f-964c-76ca03c143cc)\"" pod="openshift-machine-config-operator/machine-config-daemon-vck47" podUID="289ee6d3-fabe-417f-964c-76ca03c143cc"